diff --git a/README.md b/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..7e2ef0551e49fb5f043a82a07566ab17c38e57e6
--- /dev/null
+++ b/README.md
@@ -0,0 +1,143 @@
+---
+library_name: peft
+license: llama3.1
+base_model: meta-llama/Llama-3.1-8B-Instruct
+tags:
+- generated_from_trainer
+datasets:
+- ugaoo/medmcqa_lm_harness_10k_ts
+model-index:
+- name: out/meta_llama_Llama_3.1_8B_Instruct_ugaoo_medmcqa_lm_harness_10k_ts
+ results: []
+---
+
+
+
+[Built with Axolotl](https://github.com/axolotl-ai-cloud/axolotl)
+
+See axolotl config
+
+axolotl version: `0.8.0.dev0`
+```yaml
+base_model: meta-llama/Llama-3.1-8B-Instruct
+model_type: AutoModelForCausalLM
+tokenizer_type: AutoTokenizer
+trust_remote_code: true
+
+load_in_8bit: false
+load_in_4bit: true
+strict: false
+
+datasets:
+ - path: ugaoo/medmcqa_lm_harness_10k_ts
+ type: alpaca
+val_set_size: 0
+output_dir: ./out/meta_llama_Llama_3.1_8B_Instruct_ugaoo_medmcqa_lm_harness_10k_ts
+
+sequence_len: 4000
+sample_packing: true
+pad_to_sequence_len: true
+
+adapter: qlora
+lora_r: 256
+lora_alpha: 512
+lora_dropout: 0.05
+lora_target_linear: true
+lora_target_modules:
+ - q_proj
+ - k_proj
+ - v_proj
+ - o_proj
+ - up_proj
+ - down_proj
+ - gate_proj
+lora_modules_to_save:
+ - embed_tokens
+ - lm_head
+
+wandb_project: testsearch
+wandb_entity:
+wandb_watch:
+wandb_name: meta_llama_Llama_3.1_8B_Instruct_ugaoo_medmcqa_lm_harness_10k_ts
+wandb_log_model:
+
+gradient_accumulation_steps: 3
+micro_batch_size: 4
+num_epochs: 6
+optimizer: adamw_torch
+lr_scheduler: cosine
+learning_rate: 5e-6
+
+train_on_inputs: false
+group_by_length: false
+bf16: auto
+fp16: false
+tf32: false
+
+gradient_checkpointing: true
+early_stopping_patience:
+resume_from_checkpoint:
+logging_steps: 1
+xformers_attention:
+flash_attention: true
+
+warmup_steps: 100
+evals_per_epoch: 6
+eval_table_size:
+saves_per_epoch: 1
+debug:
+deepspeed:
+weight_decay: 0.0
+fsdp:
+fsdp_config:
+save_total_limit: 6
+special_tokens:
+ pad_token: <|end_of_text|>
+```
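As a quick sanity check on the config above, the effective LoRA scaling follows directly from `lora_r` and `lora_alpha` (a minimal sketch; the values are copied from the YAML, and the standard `alpha / r` scaling used by PEFT is assumed):

```python
# LoRA applies W + (alpha / r) * (B @ A), so the low-rank update here
# is scaled by lora_alpha / lora_r.
lora_r = 256
lora_alpha = 512

scaling = lora_alpha / lora_r
print(scaling)  # 2.0
```

A scaling of 2.0 means the adapter's contribution is doubled relative to an alpha-equals-r setup, which is worth keeping in mind if you lower the rank later without adjusting alpha.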
+
+
+
+# out/meta_llama_Llama_3.1_8B_Instruct_ugaoo_medmcqa_lm_harness_10k_ts
+
+This model is a fine-tuned version of [meta-llama/Llama-3.1-8B-Instruct](https://huggingface.co/meta-llama/Llama-3.1-8B-Instruct) on the ugaoo/medmcqa_lm_harness_10k_ts dataset.
+
+## Model description
+
+A QLoRA adapter for Llama-3.1-8B-Instruct trained with axolotl: the base model is loaded in 4-bit, LoRA (r=256, alpha=512, dropout 0.05) is applied to all linear projections, and `embed_tokens` and `lm_head` are trained in full.
+
+## Intended uses & limitations
+
+Fine-tuned on MedMCQA-style medical multiple-choice questions; intended uses beyond that setting and the model's limitations have not been documented.
+
+## Training and evaluation data
+
+The model was trained on the [ugaoo/medmcqa_lm_harness_10k_ts](https://huggingface.co/datasets/ugaoo/medmcqa_lm_harness_10k_ts) dataset in alpaca format. No validation split was held out (`val_set_size: 0`).
+
+## Training procedure
+
+### Training hyperparameters
+
+The following hyperparameters were used during training:
+- learning_rate: 5e-06
+- train_batch_size: 4
+- eval_batch_size: 4
+- seed: 42
+- distributed_type: multi-GPU
+- gradient_accumulation_steps: 3
+- total_train_batch_size: 12
+- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
+- lr_scheduler_type: cosine
+- lr_scheduler_warmup_steps: 100
+- num_epochs: 6.0
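The reported total train batch size follows from the other values above (a sketch; a world size of 1 is inferred from the numbers, even though the distributed type is listed as multi-GPU):

```python
micro_batch_size = 4            # per-device batch size from the config
gradient_accumulation_steps = 3
world_size = 1                  # inferred: 4 * 3 * 1 matches the reported total of 12

total_train_batch_size = micro_batch_size * gradient_accumulation_steps * world_size
print(total_train_batch_size)  # 12
```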
+
+### Training results
+
+
+
+### Framework versions
+
+- PEFT 0.14.0
+- Transformers 4.49.0
+- Pytorch 2.5.1+cu124
+- Datasets 3.2.0
+- Tokenizers 0.21.0
\ No newline at end of file
diff --git a/adapter_config.json b/adapter_config.json
index e4bab13e98c4d785ef78a84a367065e3d5d84c5a..9dfb3ab60881d002c4cdbcc157a93958018fe683 100644
--- a/adapter_config.json
+++ b/adapter_config.json
@@ -6,7 +6,7 @@
"eva_config": null,
"exclude_modules": null,
"fan_in_fan_out": null,
- "inference_mode": false,
+ "inference_mode": true,
"init_lora_weights": true,
"layer_replication": null,
"layers_pattern": null,
diff --git a/adapter_model.safetensors b/adapter_model.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..0bef38c297c36de0879e0a49295516870d706c7b
--- /dev/null
+++ b/adapter_model.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6d78ee61e25a9d389b642e2bc6f05ef7dcbc5f6ac8b35064353554607b639beb
+size 3443586272
diff --git a/checkpoint-170/README.md b/checkpoint-170/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..be5c87703f12b547886cc6a2ecfbe9ee150496fa
--- /dev/null
+++ b/checkpoint-170/README.md
@@ -0,0 +1,202 @@
+---
+base_model: meta-llama/Llama-3.1-8B-Instruct
+library_name: peft
+---
+
+# Model Card for Model ID
+
+
+
+
+
+## Model Details
+
+### Model Description
+
+
+
+
+
+- **Developed by:** [More Information Needed]
+- **Funded by [optional]:** [More Information Needed]
+- **Shared by [optional]:** [More Information Needed]
+- **Model type:** [More Information Needed]
+- **Language(s) (NLP):** [More Information Needed]
+- **License:** [More Information Needed]
+- **Finetuned from model [optional]:** [More Information Needed]
+
+### Model Sources [optional]
+
+
+
+- **Repository:** [More Information Needed]
+- **Paper [optional]:** [More Information Needed]
+- **Demo [optional]:** [More Information Needed]
+
+## Uses
+
+
+
+### Direct Use
+
+
+
+[More Information Needed]
+
+### Downstream Use [optional]
+
+
+
+[More Information Needed]
+
+### Out-of-Scope Use
+
+
+
+[More Information Needed]
+
+## Bias, Risks, and Limitations
+
+
+
+[More Information Needed]
+
+### Recommendations
+
+
+
+Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
+
+## How to Get Started with the Model
+
+Use the code below to get started with the model.
+
+[More Information Needed]
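Until the official snippet is filled in, loading the adapter with PEFT might look like the sketch below (the Hub repo id is a hypothetical placeholder; access to the gated base model and a GPU with enough memory for the 4-bit weights are assumed):

```python
from transformers import AutoTokenizer
from peft import AutoPeftModelForCausalLM

# Hypothetical repo id -- replace with the actual Hub id of this adapter.
model = AutoPeftModelForCausalLM.from_pretrained(
    "your-username/llama31-8b-medmcqa-qlora",
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")

prompt = "Question: Which vitamin deficiency causes scurvy?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```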
+
+## Training Details
+
+### Training Data
+
+
+
+[More Information Needed]
+
+### Training Procedure
+
+
+
+#### Preprocessing [optional]
+
+[More Information Needed]
+
+
+#### Training Hyperparameters
+
+- **Training regime:** [More Information Needed]
+
+#### Speeds, Sizes, Times [optional]
+
+
+
+[More Information Needed]
+
+## Evaluation
+
+
+
+### Testing Data, Factors & Metrics
+
+#### Testing Data
+
+
+
+[More Information Needed]
+
+#### Factors
+
+
+
+[More Information Needed]
+
+#### Metrics
+
+
+
+[More Information Needed]
+
+### Results
+
+[More Information Needed]
+
+#### Summary
+
+
+
+## Model Examination [optional]
+
+
+
+[More Information Needed]
+
+## Environmental Impact
+
+
+
+Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
+
+- **Hardware Type:** [More Information Needed]
+- **Hours used:** [More Information Needed]
+- **Cloud Provider:** [More Information Needed]
+- **Compute Region:** [More Information Needed]
+- **Carbon Emitted:** [More Information Needed]
+
+## Technical Specifications [optional]
+
+### Model Architecture and Objective
+
+[More Information Needed]
+
+### Compute Infrastructure
+
+[More Information Needed]
+
+#### Hardware
+
+[More Information Needed]
+
+#### Software
+
+[More Information Needed]
+
+## Citation [optional]
+
+
+
+**BibTeX:**
+
+[More Information Needed]
+
+**APA:**
+
+[More Information Needed]
+
+## Glossary [optional]
+
+
+
+[More Information Needed]
+
+## More Information [optional]
+
+[More Information Needed]
+
+## Model Card Authors [optional]
+
+[More Information Needed]
+
+## Model Card Contact
+
+[More Information Needed]
+
+### Framework versions
+
+- PEFT 0.14.0
\ No newline at end of file
diff --git a/checkpoint-170/adapter_config.json b/checkpoint-170/adapter_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..9dfb3ab60881d002c4cdbcc157a93958018fe683
--- /dev/null
+++ b/checkpoint-170/adapter_config.json
@@ -0,0 +1,40 @@
+{
+ "alpha_pattern": {},
+ "auto_mapping": null,
+ "base_model_name_or_path": "meta-llama/Llama-3.1-8B-Instruct",
+ "bias": "none",
+ "eva_config": null,
+ "exclude_modules": null,
+ "fan_in_fan_out": null,
+ "inference_mode": true,
+ "init_lora_weights": true,
+ "layer_replication": null,
+ "layers_pattern": null,
+ "layers_to_transform": null,
+ "loftq_config": {},
+ "lora_alpha": 512,
+ "lora_bias": false,
+ "lora_dropout": 0.05,
+ "megatron_config": null,
+ "megatron_core": "megatron.core",
+ "modules_to_save": [
+ "embed_tokens",
+ "lm_head"
+ ],
+ "peft_type": "LORA",
+ "r": 256,
+ "rank_pattern": {},
+ "revision": null,
+ "target_modules": [
+ "v_proj",
+ "up_proj",
+ "q_proj",
+ "o_proj",
+ "down_proj",
+ "gate_proj",
+ "k_proj"
+ ],
+ "task_type": "CAUSAL_LM",
+ "use_dora": false,
+ "use_rslora": false
+}
\ No newline at end of file
diff --git a/checkpoint-170/adapter_model.safetensors b/checkpoint-170/adapter_model.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..6a0d81af0bbb6f54e6c27b94e8c26b7137005d68
--- /dev/null
+++ b/checkpoint-170/adapter_model.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6f00477adbbc91cdbe3506bb315564672f3339a25a3f76cb39707f5d9e1b7816
+size 3443586272
diff --git a/checkpoint-170/global_step169/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt b/checkpoint-170/global_step169/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt
new file mode 100644
index 0000000000000000000000000000000000000000..21d3cf038a79f37ca4c1346bbee06b44cef3f420
--- /dev/null
+++ b/checkpoint-170/global_step169/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d7e377ac5d8b7c24cd88b6b2dbc08b120ffabb7a4f4425aac05fbe8476ab513c
+size 20661195036
diff --git a/checkpoint-170/global_step169/mp_rank_00_model_states.pt b/checkpoint-170/global_step169/mp_rank_00_model_states.pt
new file mode 100644
index 0000000000000000000000000000000000000000..bd4bedb71a9d748afbe6337cc4e72b747cf46a9b
--- /dev/null
+++ b/checkpoint-170/global_step169/mp_rank_00_model_states.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:354622ff359badb672cd028b100aafa94c1f3045f47dc42afb46534b9c283faa
+size 3555326649
diff --git a/checkpoint-170/latest b/checkpoint-170/latest
new file mode 100644
index 0000000000000000000000000000000000000000..4329ff92313231350556fa32048069b4a39003ca
--- /dev/null
+++ b/checkpoint-170/latest
@@ -0,0 +1 @@
+global_step169
\ No newline at end of file
diff --git a/checkpoint-170/rng_state.pth b/checkpoint-170/rng_state.pth
new file mode 100644
index 0000000000000000000000000000000000000000..cb5a84d4c8a15bc0a250ac6fdd2498b740a4224b
--- /dev/null
+++ b/checkpoint-170/rng_state.pth
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:eed6a8aa9a82114bd5c5b23ebe6ce439bc3e872c581ac59770f996845c35964d
+size 14244
diff --git a/checkpoint-170/scheduler.pt b/checkpoint-170/scheduler.pt
new file mode 100644
index 0000000000000000000000000000000000000000..85e83dd9dd4f14de3eeb680a6979427c00bad124
--- /dev/null
+++ b/checkpoint-170/scheduler.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:41efc04e78ebb59be462077f5158c941eb2354b4e3f33dd798a2bde7e4cacc3f
+size 1064
diff --git a/checkpoint-170/special_tokens_map.json b/checkpoint-170/special_tokens_map.json
new file mode 100644
index 0000000000000000000000000000000000000000..278b7f0f84be865c4687700ee7b3c63d89a51e18
--- /dev/null
+++ b/checkpoint-170/special_tokens_map.json
@@ -0,0 +1,23 @@
+{
+ "bos_token": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|eot_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<|end_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+}
diff --git a/checkpoint-170/tokenizer.json b/checkpoint-170/tokenizer.json
new file mode 100644
index 0000000000000000000000000000000000000000..1c1d8d5c9024994f1d3b00f9662b8dd89ca13cf2
--- /dev/null
+++ b/checkpoint-170/tokenizer.json
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6b9e4e7fb171f92fd137b777cc2714bf87d11576700a1dcd7a399e7bbe39537b
+size 17209920
diff --git a/checkpoint-170/tokenizer_config.json b/checkpoint-170/tokenizer_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..ca91a2ef55f4239a7af81d7c9abb05f53621a07b
--- /dev/null
+++ b/checkpoint-170/tokenizer_config.json
@@ -0,0 +1,2064 @@
+{
+ "added_tokens_decoder": {
+ "128000": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128001": {
+ "content": "<|end_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128002": {
+ "content": "<|reserved_special_token_0|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128003": {
+ "content": "<|reserved_special_token_1|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128004": {
+ "content": "<|finetune_right_pad_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128005": {
+ "content": "<|reserved_special_token_2|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128006": {
+ "content": "<|start_header_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128007": {
+ "content": "<|end_header_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128008": {
+ "content": "<|eom_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128009": {
+ "content": "<|eot_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128010": {
+ "content": "<|python_tag|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128011": {
+ "content": "<|reserved_special_token_3|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128012": {
+ "content": "<|reserved_special_token_4|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128013": {
+ "content": "<|reserved_special_token_5|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128014": {
+ "content": "<|reserved_special_token_6|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128015": {
+ "content": "<|reserved_special_token_7|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128016": {
+ "content": "<|reserved_special_token_8|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128017": {
+ "content": "<|reserved_special_token_9|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128018": {
+ "content": "<|reserved_special_token_10|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128019": {
+ "content": "<|reserved_special_token_11|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128020": {
+ "content": "<|reserved_special_token_12|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128021": {
+ "content": "<|reserved_special_token_13|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128022": {
+ "content": "<|reserved_special_token_14|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128023": {
+ "content": "<|reserved_special_token_15|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128024": {
+ "content": "<|reserved_special_token_16|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128025": {
+ "content": "<|reserved_special_token_17|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128026": {
+ "content": "<|reserved_special_token_18|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128027": {
+ "content": "<|reserved_special_token_19|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128028": {
+ "content": "<|reserved_special_token_20|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128029": {
+ "content": "<|reserved_special_token_21|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128030": {
+ "content": "<|reserved_special_token_22|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128031": {
+ "content": "<|reserved_special_token_23|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128032": {
+ "content": "<|reserved_special_token_24|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128033": {
+ "content": "<|reserved_special_token_25|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128034": {
+ "content": "<|reserved_special_token_26|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128035": {
+ "content": "<|reserved_special_token_27|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128036": {
+ "content": "<|reserved_special_token_28|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128037": {
+ "content": "<|reserved_special_token_29|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128038": {
+ "content": "<|reserved_special_token_30|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128039": {
+ "content": "<|reserved_special_token_31|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128040": {
+ "content": "<|reserved_special_token_32|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128041": {
+ "content": "<|reserved_special_token_33|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128042": {
+ "content": "<|reserved_special_token_34|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128043": {
+ "content": "<|reserved_special_token_35|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128044": {
+ "content": "<|reserved_special_token_36|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128045": {
+ "content": "<|reserved_special_token_37|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128046": {
+ "content": "<|reserved_special_token_38|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128047": {
+ "content": "<|reserved_special_token_39|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128048": {
+ "content": "<|reserved_special_token_40|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128049": {
+ "content": "<|reserved_special_token_41|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128050": {
+ "content": "<|reserved_special_token_42|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128051": {
+ "content": "<|reserved_special_token_43|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128052": {
+ "content": "<|reserved_special_token_44|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128053": {
+ "content": "<|reserved_special_token_45|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128054": {
+ "content": "<|reserved_special_token_46|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128055": {
+ "content": "<|reserved_special_token_47|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128056": {
+ "content": "<|reserved_special_token_48|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128057": {
+ "content": "<|reserved_special_token_49|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128058": {
+ "content": "<|reserved_special_token_50|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128059": {
+ "content": "<|reserved_special_token_51|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128060": {
+ "content": "<|reserved_special_token_52|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128061": {
+ "content": "<|reserved_special_token_53|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128062": {
+ "content": "<|reserved_special_token_54|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128063": {
+ "content": "<|reserved_special_token_55|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128064": {
+ "content": "<|reserved_special_token_56|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128065": {
+ "content": "<|reserved_special_token_57|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128066": {
+ "content": "<|reserved_special_token_58|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128067": {
+ "content": "<|reserved_special_token_59|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128068": {
+ "content": "<|reserved_special_token_60|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128069": {
+ "content": "<|reserved_special_token_61|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128070": {
+ "content": "<|reserved_special_token_62|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128071": {
+ "content": "<|reserved_special_token_63|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128072": {
+ "content": "<|reserved_special_token_64|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128073": {
+ "content": "<|reserved_special_token_65|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128074": {
+ "content": "<|reserved_special_token_66|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128075": {
+ "content": "<|reserved_special_token_67|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128076": {
+ "content": "<|reserved_special_token_68|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128077": {
+ "content": "<|reserved_special_token_69|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128078": {
+ "content": "<|reserved_special_token_70|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128079": {
+ "content": "<|reserved_special_token_71|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128080": {
+ "content": "<|reserved_special_token_72|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128081": {
+ "content": "<|reserved_special_token_73|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128082": {
+ "content": "<|reserved_special_token_74|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128083": {
+ "content": "<|reserved_special_token_75|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128084": {
+ "content": "<|reserved_special_token_76|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128085": {
+ "content": "<|reserved_special_token_77|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128086": {
+ "content": "<|reserved_special_token_78|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128087": {
+ "content": "<|reserved_special_token_79|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128088": {
+ "content": "<|reserved_special_token_80|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128089": {
+ "content": "<|reserved_special_token_81|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128090": {
+ "content": "<|reserved_special_token_82|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128091": {
+ "content": "<|reserved_special_token_83|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128092": {
+ "content": "<|reserved_special_token_84|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128093": {
+ "content": "<|reserved_special_token_85|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128094": {
+ "content": "<|reserved_special_token_86|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128095": {
+ "content": "<|reserved_special_token_87|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128096": {
+ "content": "<|reserved_special_token_88|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128097": {
+ "content": "<|reserved_special_token_89|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128098": {
+ "content": "<|reserved_special_token_90|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128099": {
+ "content": "<|reserved_special_token_91|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128100": {
+ "content": "<|reserved_special_token_92|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128101": {
+ "content": "<|reserved_special_token_93|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128102": {
+ "content": "<|reserved_special_token_94|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128103": {
+ "content": "<|reserved_special_token_95|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128104": {
+ "content": "<|reserved_special_token_96|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128105": {
+ "content": "<|reserved_special_token_97|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128106": {
+ "content": "<|reserved_special_token_98|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128107": {
+ "content": "<|reserved_special_token_99|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128108": {
+ "content": "<|reserved_special_token_100|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128109": {
+ "content": "<|reserved_special_token_101|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128110": {
+ "content": "<|reserved_special_token_102|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128111": {
+ "content": "<|reserved_special_token_103|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128112": {
+ "content": "<|reserved_special_token_104|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128113": {
+ "content": "<|reserved_special_token_105|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128114": {
+ "content": "<|reserved_special_token_106|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128115": {
+ "content": "<|reserved_special_token_107|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128116": {
+ "content": "<|reserved_special_token_108|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128117": {
+ "content": "<|reserved_special_token_109|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128118": {
+ "content": "<|reserved_special_token_110|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128119": {
+ "content": "<|reserved_special_token_111|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128120": {
+ "content": "<|reserved_special_token_112|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128121": {
+ "content": "<|reserved_special_token_113|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128122": {
+ "content": "<|reserved_special_token_114|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128123": {
+ "content": "<|reserved_special_token_115|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128124": {
+ "content": "<|reserved_special_token_116|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128125": {
+ "content": "<|reserved_special_token_117|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128126": {
+ "content": "<|reserved_special_token_118|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128127": {
+ "content": "<|reserved_special_token_119|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128128": {
+ "content": "<|reserved_special_token_120|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128129": {
+ "content": "<|reserved_special_token_121|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128130": {
+ "content": "<|reserved_special_token_122|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128131": {
+ "content": "<|reserved_special_token_123|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128132": {
+ "content": "<|reserved_special_token_124|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128133": {
+ "content": "<|reserved_special_token_125|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128134": {
+ "content": "<|reserved_special_token_126|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128135": {
+ "content": "<|reserved_special_token_127|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128136": {
+ "content": "<|reserved_special_token_128|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128137": {
+ "content": "<|reserved_special_token_129|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128138": {
+ "content": "<|reserved_special_token_130|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128139": {
+ "content": "<|reserved_special_token_131|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128140": {
+ "content": "<|reserved_special_token_132|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128141": {
+ "content": "<|reserved_special_token_133|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128142": {
+ "content": "<|reserved_special_token_134|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128143": {
+ "content": "<|reserved_special_token_135|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128144": {
+ "content": "<|reserved_special_token_136|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128145": {
+ "content": "<|reserved_special_token_137|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128146": {
+ "content": "<|reserved_special_token_138|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128147": {
+ "content": "<|reserved_special_token_139|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128148": {
+ "content": "<|reserved_special_token_140|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128149": {
+ "content": "<|reserved_special_token_141|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128150": {
+ "content": "<|reserved_special_token_142|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128151": {
+ "content": "<|reserved_special_token_143|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128152": {
+ "content": "<|reserved_special_token_144|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128153": {
+ "content": "<|reserved_special_token_145|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128154": {
+ "content": "<|reserved_special_token_146|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128155": {
+ "content": "<|reserved_special_token_147|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128156": {
+ "content": "<|reserved_special_token_148|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128157": {
+ "content": "<|reserved_special_token_149|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128158": {
+ "content": "<|reserved_special_token_150|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128159": {
+ "content": "<|reserved_special_token_151|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128160": {
+ "content": "<|reserved_special_token_152|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128161": {
+ "content": "<|reserved_special_token_153|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128162": {
+ "content": "<|reserved_special_token_154|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128163": {
+ "content": "<|reserved_special_token_155|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128164": {
+ "content": "<|reserved_special_token_156|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128165": {
+ "content": "<|reserved_special_token_157|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128166": {
+ "content": "<|reserved_special_token_158|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128167": {
+ "content": "<|reserved_special_token_159|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128168": {
+ "content": "<|reserved_special_token_160|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128169": {
+ "content": "<|reserved_special_token_161|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128170": {
+ "content": "<|reserved_special_token_162|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128171": {
+ "content": "<|reserved_special_token_163|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128172": {
+ "content": "<|reserved_special_token_164|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128173": {
+ "content": "<|reserved_special_token_165|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128174": {
+ "content": "<|reserved_special_token_166|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128175": {
+ "content": "<|reserved_special_token_167|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128176": {
+ "content": "<|reserved_special_token_168|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128177": {
+ "content": "<|reserved_special_token_169|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128178": {
+ "content": "<|reserved_special_token_170|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128179": {
+ "content": "<|reserved_special_token_171|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128180": {
+ "content": "<|reserved_special_token_172|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128181": {
+ "content": "<|reserved_special_token_173|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128182": {
+ "content": "<|reserved_special_token_174|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128183": {
+ "content": "<|reserved_special_token_175|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128184": {
+ "content": "<|reserved_special_token_176|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128185": {
+ "content": "<|reserved_special_token_177|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128186": {
+ "content": "<|reserved_special_token_178|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128187": {
+ "content": "<|reserved_special_token_179|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128188": {
+ "content": "<|reserved_special_token_180|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128189": {
+ "content": "<|reserved_special_token_181|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128190": {
+ "content": "<|reserved_special_token_182|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128191": {
+ "content": "<|reserved_special_token_183|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128192": {
+ "content": "<|reserved_special_token_184|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128193": {
+ "content": "<|reserved_special_token_185|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128194": {
+ "content": "<|reserved_special_token_186|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128195": {
+ "content": "<|reserved_special_token_187|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128196": {
+ "content": "<|reserved_special_token_188|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128197": {
+ "content": "<|reserved_special_token_189|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128198": {
+ "content": "<|reserved_special_token_190|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128199": {
+ "content": "<|reserved_special_token_191|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128200": {
+ "content": "<|reserved_special_token_192|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128201": {
+ "content": "<|reserved_special_token_193|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128202": {
+ "content": "<|reserved_special_token_194|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128203": {
+ "content": "<|reserved_special_token_195|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128204": {
+ "content": "<|reserved_special_token_196|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128205": {
+ "content": "<|reserved_special_token_197|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128206": {
+ "content": "<|reserved_special_token_198|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128207": {
+ "content": "<|reserved_special_token_199|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128208": {
+ "content": "<|reserved_special_token_200|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128209": {
+ "content": "<|reserved_special_token_201|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128210": {
+ "content": "<|reserved_special_token_202|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128211": {
+ "content": "<|reserved_special_token_203|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128212": {
+ "content": "<|reserved_special_token_204|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128213": {
+ "content": "<|reserved_special_token_205|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128214": {
+ "content": "<|reserved_special_token_206|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128215": {
+ "content": "<|reserved_special_token_207|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128216": {
+ "content": "<|reserved_special_token_208|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128217": {
+ "content": "<|reserved_special_token_209|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128218": {
+ "content": "<|reserved_special_token_210|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128219": {
+ "content": "<|reserved_special_token_211|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128220": {
+ "content": "<|reserved_special_token_212|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128221": {
+ "content": "<|reserved_special_token_213|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128222": {
+ "content": "<|reserved_special_token_214|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128223": {
+ "content": "<|reserved_special_token_215|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128224": {
+ "content": "<|reserved_special_token_216|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128225": {
+ "content": "<|reserved_special_token_217|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128226": {
+ "content": "<|reserved_special_token_218|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128227": {
+ "content": "<|reserved_special_token_219|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128228": {
+ "content": "<|reserved_special_token_220|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128229": {
+ "content": "<|reserved_special_token_221|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128230": {
+ "content": "<|reserved_special_token_222|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128231": {
+ "content": "<|reserved_special_token_223|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128232": {
+ "content": "<|reserved_special_token_224|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128233": {
+ "content": "<|reserved_special_token_225|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128234": {
+ "content": "<|reserved_special_token_226|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128235": {
+ "content": "<|reserved_special_token_227|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128236": {
+ "content": "<|reserved_special_token_228|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128237": {
+ "content": "<|reserved_special_token_229|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128238": {
+ "content": "<|reserved_special_token_230|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128239": {
+ "content": "<|reserved_special_token_231|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128240": {
+ "content": "<|reserved_special_token_232|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128241": {
+ "content": "<|reserved_special_token_233|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128242": {
+ "content": "<|reserved_special_token_234|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128243": {
+ "content": "<|reserved_special_token_235|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128244": {
+ "content": "<|reserved_special_token_236|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128245": {
+ "content": "<|reserved_special_token_237|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128246": {
+ "content": "<|reserved_special_token_238|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128247": {
+ "content": "<|reserved_special_token_239|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128248": {
+ "content": "<|reserved_special_token_240|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128249": {
+ "content": "<|reserved_special_token_241|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128250": {
+ "content": "<|reserved_special_token_242|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128251": {
+ "content": "<|reserved_special_token_243|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128252": {
+ "content": "<|reserved_special_token_244|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128253": {
+ "content": "<|reserved_special_token_245|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128254": {
+ "content": "<|reserved_special_token_246|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128255": {
+ "content": "<|reserved_special_token_247|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<|begin_of_text|>",
+ "chat_template": "{{- bos_token }}\n{%- if custom_tools is defined %}\n {%- set tools = custom_tools %}\n{%- endif %}\n{%- if not tools_in_user_message is defined %}\n {%- set tools_in_user_message = true %}\n{%- endif %}\n{%- if not date_string is defined %}\n {%- set date_string = \"26 Jul 2024\" %}\n{%- endif %}\n{%- if not tools is defined %}\n {%- set tools = none %}\n{%- endif %}\n\n{#- This block extracts the system message, so we can slot it into the right place. #}\n{%- if messages[0]['role'] == 'system' %}\n {%- set system_message = messages[0]['content']|trim %}\n {%- set messages = messages[1:] %}\n{%- else %}\n {%- set system_message = \"\" %}\n{%- endif %}\n\n{#- System message + builtin tools #}\n{{- \"<|start_header_id|>system<|end_header_id|>\\n\\n\" }}\n{%- if builtin_tools is defined or tools is not none %}\n {{- \"Environment: ipython\\n\" }}\n{%- endif %}\n{%- if builtin_tools is defined %}\n {{- \"Tools: \" + builtin_tools | reject('equalto', 'code_interpreter') | join(\", \") + \"\\n\\n\"}}\n{%- endif %}\n{{- \"Cutting Knowledge Date: December 2023\\n\" }}\n{{- \"Today Date: \" + date_string + \"\\n\\n\" }}\n{%- if tools is not none and not tools_in_user_message %}\n {{- \"You have access to the following functions. To call a function, please respond with JSON for a function call.\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' }}\n {{- \"Do not use variables.\\n\\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\\n\\n\" }}\n {%- endfor %}\n{%- endif %}\n{{- system_message }}\n{{- \"<|eot_id|>\" }}\n\n{#- Custom tools are passed in a user message with some extra guidance #}\n{%- if tools_in_user_message and not tools is none %}\n {#- Extract the first user message so we can plug it in here #}\n {%- if messages | length != 0 %}\n {%- set first_user_message = messages[0]['content']|trim %}\n {%- set messages = messages[1:] %}\n {%- else %}\n {{- raise_exception(\"Cannot put tools in the first user message when there's no first user message!\") }}\n{%- endif %}\n {{- '<|start_header_id|>user<|end_header_id|>\\n\\n' -}}\n {{- \"Given the following functions, please respond with a JSON for a function call \" }}\n {{- \"with its proper arguments that best answers the given prompt.\\n\\n\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' }}\n {{- \"Do not use variables.\\n\\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\\n\\n\" }}\n {%- endfor %}\n {{- first_user_message + \"<|eot_id|>\"}}\n{%- endif %}\n\n{%- for message in messages %}\n {%- if not (message.role == 'ipython' or message.role == 'tool' or 'tool_calls' in message) %}\n {{- '<|start_header_id|>' + message['role'] + '<|end_header_id|>\\n\\n'+ message['content'] | trim + '<|eot_id|>' }}\n {%- elif 'tool_calls' in message %}\n {%- if not message.tool_calls|length == 1 %}\n {{- raise_exception(\"This model only supports single tool-calls at once!\") }}\n {%- endif %}\n {%- set tool_call = message.tool_calls[0].function %}\n {%- if builtin_tools is defined and tool_call.name in builtin_tools %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' -}}\n {{- \"<|python_tag|>\" + tool_call.name + \".call(\" }}\n {%- for arg_name, arg_val in tool_call.arguments | items %}\n {{- arg_name + '=\"' + arg_val + '\"' }}\n {%- if not loop.last %}\n {{- \", \" }}\n {%- endif %}\n {%- endfor %}\n {{- \")\" }}\n {%- else %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' -}}\n {{- '{\"name\": \"' + tool_call.name + '\", ' }}\n {{- '\"parameters\": ' }}\n {{- tool_call.arguments | tojson }}\n {{- \"}\" }}\n {%- endif %}\n {%- if builtin_tools is defined %}\n {#- This means we're in ipython mode #}\n {{- \"<|eom_id|>\" }}\n {%- else %}\n {{- \"<|eot_id|>\" }}\n {%- endif %}\n {%- elif message.role == \"tool\" or message.role == \"ipython\" %}\n {{- \"<|start_header_id|>ipython<|end_header_id|>\\n\\n\" }}\n {%- if message.content is mapping or message.content is iterable %}\n {{- message.content | tojson }}\n {%- else %}\n {{- message.content }}\n {%- endif %}\n {{- \"<|eot_id|>\" }}\n {%- endif %}\n{%- endfor %}\n{%- if add_generation_prompt %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' }}\n{%- endif %}\n",
+ "clean_up_tokenization_spaces": true,
+ "eos_token": "<|eot_id|>",
+ "extra_special_tokens": {},
+ "model_input_names": [
+ "input_ids",
+ "attention_mask"
+ ],
+ "model_max_length": 131072,
+ "pad_token": "<|end_of_text|>",
+ "tokenizer_class": "PreTrainedTokenizer"
+}
diff --git a/checkpoint-170/trainer_state.json b/checkpoint-170/trainer_state.json
new file mode 100644
index 0000000000000000000000000000000000000000..a8f4c78aa4224d9d822f3b135b0397902e56e767
--- /dev/null
+++ b/checkpoint-170/trainer_state.json
@@ -0,0 +1,1223 @@
+{
+ "best_metric": null,
+ "best_model_checkpoint": null,
+ "epoch": 1.984375,
+ "eval_steps": 500,
+ "global_step": 170,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.01171875,
+ "grad_norm": 36.23282241821289,
+ "learning_rate": 5.0000000000000004e-08,
+ "loss": 2.3839,
+ "step": 1
+ },
+ {
+ "epoch": 0.0234375,
+ "grad_norm": 35.918636322021484,
+ "learning_rate": 1.0000000000000001e-07,
+ "loss": 2.3798,
+ "step": 2
+ },
+ {
+ "epoch": 0.03515625,
+ "grad_norm": 35.62618637084961,
+ "learning_rate": 1.5000000000000002e-07,
+ "loss": 2.386,
+ "step": 3
+ },
+ {
+ "epoch": 0.046875,
+ "grad_norm": 35.966087341308594,
+ "learning_rate": 2.0000000000000002e-07,
+ "loss": 2.3803,
+ "step": 4
+ },
+ {
+ "epoch": 0.05859375,
+ "grad_norm": 35.38177490234375,
+ "learning_rate": 2.5000000000000004e-07,
+ "loss": 2.3937,
+ "step": 5
+ },
+ {
+ "epoch": 0.0703125,
+ "grad_norm": 35.99677658081055,
+ "learning_rate": 3.0000000000000004e-07,
+ "loss": 2.3906,
+ "step": 6
+ },
+ {
+ "epoch": 0.08203125,
+ "grad_norm": 35.44341278076172,
+ "learning_rate": 3.5000000000000004e-07,
+ "loss": 2.3539,
+ "step": 7
+ },
+ {
+ "epoch": 0.09375,
+ "grad_norm": 35.300697326660156,
+ "learning_rate": 4.0000000000000003e-07,
+ "loss": 2.3459,
+ "step": 8
+ },
+ {
+ "epoch": 0.10546875,
+ "grad_norm": 34.092952728271484,
+ "learning_rate": 4.5000000000000003e-07,
+ "loss": 2.2959,
+ "step": 9
+ },
+ {
+ "epoch": 0.1171875,
+ "grad_norm": 34.46371841430664,
+ "learning_rate": 5.000000000000001e-07,
+ "loss": 2.2661,
+ "step": 10
+ },
+ {
+ "epoch": 0.12890625,
+ "grad_norm": 34.62260818481445,
+ "learning_rate": 5.5e-07,
+ "loss": 2.2918,
+ "step": 11
+ },
+ {
+ "epoch": 0.140625,
+ "grad_norm": 33.790374755859375,
+ "learning_rate": 6.000000000000001e-07,
+ "loss": 2.223,
+ "step": 12
+ },
+ {
+ "epoch": 0.15234375,
+ "grad_norm": 33.766536712646484,
+ "learning_rate": 6.5e-07,
+ "loss": 2.2267,
+ "step": 13
+ },
+ {
+ "epoch": 0.1640625,
+ "grad_norm": 33.894081115722656,
+ "learning_rate": 7.000000000000001e-07,
+ "loss": 2.1465,
+ "step": 14
+ },
+ {
+ "epoch": 0.17578125,
+ "grad_norm": 33.162452697753906,
+ "learning_rate": 7.5e-07,
+ "loss": 2.0495,
+ "step": 15
+ },
+ {
+ "epoch": 0.1875,
+ "grad_norm": 32.954341888427734,
+ "learning_rate": 8.000000000000001e-07,
+ "loss": 1.9627,
+ "step": 16
+ },
+ {
+ "epoch": 0.19921875,
+ "grad_norm": 33.96324157714844,
+ "learning_rate": 8.500000000000001e-07,
+ "loss": 1.8867,
+ "step": 17
+ },
+ {
+ "epoch": 0.2109375,
+ "grad_norm": 33.81139373779297,
+ "learning_rate": 9.000000000000001e-07,
+ "loss": 1.7752,
+ "step": 18
+ },
+ {
+ "epoch": 0.22265625,
+ "grad_norm": 34.87086868286133,
+ "learning_rate": 9.500000000000001e-07,
+ "loss": 1.6944,
+ "step": 19
+ },
+ {
+ "epoch": 0.234375,
+ "grad_norm": 34.84965133666992,
+ "learning_rate": 1.0000000000000002e-06,
+ "loss": 1.5707,
+ "step": 20
+ },
+ {
+ "epoch": 0.24609375,
+ "grad_norm": 35.227317810058594,
+ "learning_rate": 1.0500000000000001e-06,
+ "loss": 1.4369,
+ "step": 21
+ },
+ {
+ "epoch": 0.2578125,
+ "grad_norm": 34.91344451904297,
+ "learning_rate": 1.1e-06,
+ "loss": 1.3202,
+ "step": 22
+ },
+ {
+ "epoch": 0.26953125,
+ "grad_norm": 31.7376766204834,
+ "learning_rate": 1.1500000000000002e-06,
+ "loss": 1.1398,
+ "step": 23
+ },
+ {
+ "epoch": 0.28125,
+ "grad_norm": 30.24741554260254,
+ "learning_rate": 1.2000000000000002e-06,
+ "loss": 1.0421,
+ "step": 24
+ },
+ {
+ "epoch": 0.29296875,
+ "grad_norm": 28.292400360107422,
+ "learning_rate": 1.25e-06,
+ "loss": 0.8817,
+ "step": 25
+ },
+ {
+ "epoch": 0.3046875,
+ "grad_norm": 30.44672393798828,
+ "learning_rate": 1.3e-06,
+ "loss": 0.7073,
+ "step": 26
+ },
+ {
+ "epoch": 0.31640625,
+ "grad_norm": 29.416427612304688,
+ "learning_rate": 1.3500000000000002e-06,
+ "loss": 0.5444,
+ "step": 27
+ },
+ {
+ "epoch": 0.328125,
+ "grad_norm": 24.820096969604492,
+ "learning_rate": 1.4000000000000001e-06,
+ "loss": 0.4025,
+ "step": 28
+ },
+ {
+ "epoch": 0.33984375,
+ "grad_norm": 21.023277282714844,
+ "learning_rate": 1.45e-06,
+ "loss": 0.307,
+ "step": 29
+ },
+ {
+ "epoch": 0.3515625,
+ "grad_norm": 19.656967163085938,
+ "learning_rate": 1.5e-06,
+ "loss": 0.2151,
+ "step": 30
+ },
+ {
+ "epoch": 0.36328125,
+ "grad_norm": 14.91929817199707,
+ "learning_rate": 1.5500000000000002e-06,
+ "loss": 0.1448,
+ "step": 31
+ },
+ {
+ "epoch": 0.375,
+ "grad_norm": 5.083199977874756,
+ "learning_rate": 1.6000000000000001e-06,
+ "loss": 0.09,
+ "step": 32
+ },
+ {
+ "epoch": 0.38671875,
+ "grad_norm": 2.320681571960449,
+ "learning_rate": 1.6500000000000003e-06,
+ "loss": 0.0641,
+ "step": 33
+ },
+ {
+ "epoch": 0.3984375,
+ "grad_norm": 1.6233159303665161,
+ "learning_rate": 1.7000000000000002e-06,
+ "loss": 0.0584,
+ "step": 34
+ },
+ {
+ "epoch": 0.41015625,
+ "grad_norm": 1.6057201623916626,
+ "learning_rate": 1.75e-06,
+ "loss": 0.0626,
+ "step": 35
+ },
+ {
+ "epoch": 0.421875,
+ "grad_norm": 1.8360320329666138,
+ "learning_rate": 1.8000000000000001e-06,
+ "loss": 0.0563,
+ "step": 36
+ },
+ {
+ "epoch": 0.43359375,
+ "grad_norm": 1.736350178718567,
+ "learning_rate": 1.85e-06,
+ "loss": 0.0609,
+ "step": 37
+ },
+ {
+ "epoch": 0.4453125,
+ "grad_norm": 1.1473922729492188,
+ "learning_rate": 1.9000000000000002e-06,
+ "loss": 0.0541,
+ "step": 38
+ },
+ {
+ "epoch": 0.45703125,
+ "grad_norm": 1.1722168922424316,
+ "learning_rate": 1.9500000000000004e-06,
+ "loss": 0.0534,
+ "step": 39
+ },
+ {
+ "epoch": 0.46875,
+ "grad_norm": 1.356987714767456,
+ "learning_rate": 2.0000000000000003e-06,
+ "loss": 0.0496,
+ "step": 40
+ },
+ {
+ "epoch": 0.48046875,
+ "grad_norm": 0.8023216724395752,
+ "learning_rate": 2.05e-06,
+ "loss": 0.0527,
+ "step": 41
+ },
+ {
+ "epoch": 0.4921875,
+ "grad_norm": 0.9803515672683716,
+ "learning_rate": 2.1000000000000002e-06,
+ "loss": 0.0478,
+ "step": 42
+ },
+ {
+ "epoch": 0.50390625,
+ "grad_norm": 0.8733468651771545,
+ "learning_rate": 2.15e-06,
+ "loss": 0.052,
+ "step": 43
+ },
+ {
+ "epoch": 0.515625,
+ "grad_norm": 0.8213743567466736,
+ "learning_rate": 2.2e-06,
+ "loss": 0.0448,
+ "step": 44
+ },
+ {
+ "epoch": 0.52734375,
+ "grad_norm": 0.843189537525177,
+ "learning_rate": 2.25e-06,
+ "loss": 0.0498,
+ "step": 45
+ },
+ {
+ "epoch": 0.5390625,
+ "grad_norm": 0.8801079392433167,
+ "learning_rate": 2.3000000000000004e-06,
+ "loss": 0.0408,
+ "step": 46
+ },
+ {
+ "epoch": 0.55078125,
+ "grad_norm": 0.7131401300430298,
+ "learning_rate": 2.35e-06,
+ "loss": 0.0405,
+ "step": 47
+ },
+ {
+ "epoch": 0.5625,
+ "grad_norm": 0.8996126651763916,
+ "learning_rate": 2.4000000000000003e-06,
+ "loss": 0.0525,
+ "step": 48
+ },
+ {
+ "epoch": 0.57421875,
+ "grad_norm": 0.8606986403465271,
+ "learning_rate": 2.4500000000000003e-06,
+ "loss": 0.0438,
+ "step": 49
+ },
+ {
+ "epoch": 0.5859375,
+ "grad_norm": 0.6918051838874817,
+ "learning_rate": 2.5e-06,
+ "loss": 0.0394,
+ "step": 50
+ },
+ {
+ "epoch": 0.59765625,
+ "grad_norm": 0.6177802085876465,
+ "learning_rate": 2.55e-06,
+ "loss": 0.0387,
+ "step": 51
+ },
+ {
+ "epoch": 0.609375,
+ "grad_norm": 0.7042555809020996,
+ "learning_rate": 2.6e-06,
+ "loss": 0.0434,
+ "step": 52
+ },
+ {
+ "epoch": 0.62109375,
+ "grad_norm": 0.6537717580795288,
+ "learning_rate": 2.6500000000000005e-06,
+ "loss": 0.0396,
+ "step": 53
+ },
+ {
+ "epoch": 0.6328125,
+ "grad_norm": 0.7834082841873169,
+ "learning_rate": 2.7000000000000004e-06,
+ "loss": 0.0411,
+ "step": 54
+ },
+ {
+ "epoch": 0.64453125,
+ "grad_norm": 0.7287272810935974,
+ "learning_rate": 2.7500000000000004e-06,
+ "loss": 0.0408,
+ "step": 55
+ },
+ {
+ "epoch": 0.65625,
+ "grad_norm": 0.7186263203620911,
+ "learning_rate": 2.8000000000000003e-06,
+ "loss": 0.0394,
+ "step": 56
+ },
+ {
+ "epoch": 0.66796875,
+ "grad_norm": 0.7264899611473083,
+ "learning_rate": 2.85e-06,
+ "loss": 0.0427,
+ "step": 57
+ },
+ {
+ "epoch": 0.6796875,
+ "grad_norm": 0.7665618062019348,
+ "learning_rate": 2.9e-06,
+ "loss": 0.0368,
+ "step": 58
+ },
+ {
+ "epoch": 0.69140625,
+ "grad_norm": 0.7222962379455566,
+ "learning_rate": 2.95e-06,
+ "loss": 0.0412,
+ "step": 59
+ },
+ {
+ "epoch": 0.703125,
+ "grad_norm": 0.7061101794242859,
+ "learning_rate": 3e-06,
+ "loss": 0.0377,
+ "step": 60
+ },
+ {
+ "epoch": 0.71484375,
+ "grad_norm": 0.5724324584007263,
+ "learning_rate": 3.05e-06,
+ "loss": 0.0387,
+ "step": 61
+ },
+ {
+ "epoch": 0.7265625,
+ "grad_norm": 0.5535506010055542,
+ "learning_rate": 3.1000000000000004e-06,
+ "loss": 0.0403,
+ "step": 62
+ },
+ {
+ "epoch": 0.73828125,
+ "grad_norm": 0.6553678512573242,
+ "learning_rate": 3.1500000000000003e-06,
+ "loss": 0.0415,
+ "step": 63
+ },
+ {
+ "epoch": 0.75,
+ "grad_norm": 0.6137285828590393,
+ "learning_rate": 3.2000000000000003e-06,
+ "loss": 0.0383,
+ "step": 64
+ },
+ {
+ "epoch": 0.76171875,
+ "grad_norm": 0.5985754132270813,
+ "learning_rate": 3.2500000000000002e-06,
+ "loss": 0.0355,
+ "step": 65
+ },
+ {
+ "epoch": 0.7734375,
+ "grad_norm": 0.5903909802436829,
+ "learning_rate": 3.3000000000000006e-06,
+ "loss": 0.0374,
+ "step": 66
+ },
+ {
+ "epoch": 0.78515625,
+ "grad_norm": 0.5718765258789062,
+ "learning_rate": 3.3500000000000005e-06,
+ "loss": 0.0339,
+ "step": 67
+ },
+ {
+ "epoch": 0.796875,
+ "grad_norm": 0.6844965815544128,
+ "learning_rate": 3.4000000000000005e-06,
+ "loss": 0.0405,
+ "step": 68
+ },
+ {
+ "epoch": 0.80859375,
+ "grad_norm": 0.5959618091583252,
+ "learning_rate": 3.45e-06,
+ "loss": 0.0338,
+ "step": 69
+ },
+ {
+ "epoch": 0.8203125,
+ "grad_norm": 0.6095123291015625,
+ "learning_rate": 3.5e-06,
+ "loss": 0.0362,
+ "step": 70
+ },
+ {
+ "epoch": 0.83203125,
+ "grad_norm": 0.543708086013794,
+ "learning_rate": 3.5500000000000003e-06,
+ "loss": 0.0355,
+ "step": 71
+ },
+ {
+ "epoch": 0.84375,
+ "grad_norm": 0.6969983577728271,
+ "learning_rate": 3.6000000000000003e-06,
+ "loss": 0.0325,
+ "step": 72
+ },
+ {
+ "epoch": 0.85546875,
+ "grad_norm": 0.6022969484329224,
+ "learning_rate": 3.65e-06,
+ "loss": 0.0342,
+ "step": 73
+ },
+ {
+ "epoch": 0.8671875,
+ "grad_norm": 0.6262147426605225,
+ "learning_rate": 3.7e-06,
+ "loss": 0.0348,
+ "step": 74
+ },
+ {
+ "epoch": 0.87890625,
+ "grad_norm": 0.5729933381080627,
+ "learning_rate": 3.7500000000000005e-06,
+ "loss": 0.0318,
+ "step": 75
+ },
+ {
+ "epoch": 0.890625,
+ "grad_norm": 0.5846775770187378,
+ "learning_rate": 3.8000000000000005e-06,
+ "loss": 0.0309,
+ "step": 76
+ },
+ {
+ "epoch": 0.90234375,
+ "grad_norm": 0.6469219923019409,
+ "learning_rate": 3.85e-06,
+ "loss": 0.0324,
+ "step": 77
+ },
+ {
+ "epoch": 0.9140625,
+ "grad_norm": 0.6574859023094177,
+ "learning_rate": 3.900000000000001e-06,
+ "loss": 0.0325,
+ "step": 78
+ },
+ {
+ "epoch": 0.92578125,
+ "grad_norm": 0.5833832025527954,
+ "learning_rate": 3.95e-06,
+ "loss": 0.0232,
+ "step": 79
+ },
+ {
+ "epoch": 0.9375,
+ "grad_norm": 0.7503570318222046,
+ "learning_rate": 4.000000000000001e-06,
+ "loss": 0.0267,
+ "step": 80
+ },
+ {
+ "epoch": 0.94921875,
+ "grad_norm": 0.7181633114814758,
+ "learning_rate": 4.05e-06,
+ "loss": 0.0304,
+ "step": 81
+ },
+ {
+ "epoch": 0.9609375,
+ "grad_norm": 0.6477274298667908,
+ "learning_rate": 4.1e-06,
+ "loss": 0.0297,
+ "step": 82
+ },
+ {
+ "epoch": 0.97265625,
+ "grad_norm": 0.6768563389778137,
+ "learning_rate": 4.15e-06,
+ "loss": 0.0279,
+ "step": 83
+ },
+ {
+ "epoch": 0.984375,
+ "grad_norm": 0.7905837297439575,
+ "learning_rate": 4.2000000000000004e-06,
+ "loss": 0.0301,
+ "step": 84
+ },
+ {
+ "epoch": 0.99609375,
+ "grad_norm": 0.5576608777046204,
+ "learning_rate": 4.25e-06,
+ "loss": 0.0322,
+ "step": 85
+ },
+ {
+ "epoch": 1.0,
+ "grad_norm": 0.5576608777046204,
+ "learning_rate": 4.3e-06,
+ "loss": 0.0226,
+ "step": 86
+ },
+ {
+ "epoch": 1.01171875,
+ "grad_norm": 1.0774812698364258,
+ "learning_rate": 4.350000000000001e-06,
+ "loss": 0.0215,
+ "step": 87
+ },
+ {
+ "epoch": 1.0234375,
+ "grad_norm": 0.47373324632644653,
+ "learning_rate": 4.4e-06,
+ "loss": 0.0235,
+ "step": 88
+ },
+ {
+ "epoch": 1.03515625,
+ "grad_norm": 0.7665970325469971,
+ "learning_rate": 4.450000000000001e-06,
+ "loss": 0.0242,
+ "step": 89
+ },
+ {
+ "epoch": 1.046875,
+ "grad_norm": 0.6290147304534912,
+ "learning_rate": 4.5e-06,
+ "loss": 0.0209,
+ "step": 90
+ },
+ {
+ "epoch": 1.05859375,
+ "grad_norm": 0.5703024864196777,
+ "learning_rate": 4.5500000000000005e-06,
+ "loss": 0.0192,
+ "step": 91
+ },
+ {
+ "epoch": 1.0703125,
+ "grad_norm": 0.6099259853363037,
+ "learning_rate": 4.600000000000001e-06,
+ "loss": 0.0181,
+ "step": 92
+ },
+ {
+ "epoch": 1.08203125,
+ "grad_norm": 0.6570988297462463,
+ "learning_rate": 4.65e-06,
+ "loss": 0.0201,
+ "step": 93
+ },
+ {
+ "epoch": 1.09375,
+ "grad_norm": 0.7848325371742249,
+ "learning_rate": 4.7e-06,
+ "loss": 0.0253,
+ "step": 94
+ },
+ {
+ "epoch": 1.10546875,
+ "grad_norm": 0.6759209036827087,
+ "learning_rate": 4.75e-06,
+ "loss": 0.0195,
+ "step": 95
+ },
+ {
+ "epoch": 1.1171875,
+ "grad_norm": 0.4861151874065399,
+ "learning_rate": 4.800000000000001e-06,
+ "loss": 0.0191,
+ "step": 96
+ },
+ {
+ "epoch": 1.12890625,
+ "grad_norm": 0.6268576383590698,
+ "learning_rate": 4.85e-06,
+ "loss": 0.0211,
+ "step": 97
+ },
+ {
+ "epoch": 1.140625,
+ "grad_norm": 0.5862017869949341,
+ "learning_rate": 4.9000000000000005e-06,
+ "loss": 0.0177,
+ "step": 98
+ },
+ {
+ "epoch": 1.15234375,
+ "grad_norm": 0.4569724202156067,
+ "learning_rate": 4.95e-06,
+ "loss": 0.0164,
+ "step": 99
+ },
+ {
+ "epoch": 1.1640625,
+ "grad_norm": 0.4539048969745636,
+ "learning_rate": 5e-06,
+ "loss": 0.0152,
+ "step": 100
+ },
+ {
+ "epoch": 1.17578125,
+ "grad_norm": 0.4553528428077698,
+ "learning_rate": 4.999926609487568e-06,
+ "loss": 0.0208,
+ "step": 101
+ },
+ {
+ "epoch": 1.1875,
+ "grad_norm": 0.5182592272758484,
+ "learning_rate": 4.999706442259205e-06,
+ "loss": 0.0154,
+ "step": 102
+ },
+ {
+ "epoch": 1.19921875,
+ "grad_norm": 0.5602673888206482,
+ "learning_rate": 4.999339511241458e-06,
+ "loss": 0.0196,
+ "step": 103
+ },
+ {
+ "epoch": 1.2109375,
+ "grad_norm": 0.7579494118690491,
+ "learning_rate": 4.9988258379777334e-06,
+ "loss": 0.0198,
+ "step": 104
+ },
+ {
+ "epoch": 1.22265625,
+ "grad_norm": 0.603757381439209,
+ "learning_rate": 4.998165452627025e-06,
+ "loss": 0.0185,
+ "step": 105
+ },
+ {
+ "epoch": 1.234375,
+ "grad_norm": 0.5520291924476624,
+ "learning_rate": 4.99735839396215e-06,
+ "loss": 0.018,
+ "step": 106
+ },
+ {
+ "epoch": 1.24609375,
+ "grad_norm": 0.55808424949646,
+ "learning_rate": 4.996404709367466e-06,
+ "loss": 0.0159,
+ "step": 107
+ },
+ {
+ "epoch": 1.2578125,
+ "grad_norm": 0.47174298763275146,
+ "learning_rate": 4.995304454836095e-06,
+ "loss": 0.0122,
+ "step": 108
+ },
+ {
+ "epoch": 1.26953125,
+ "grad_norm": 0.5289337038993835,
+ "learning_rate": 4.994057694966632e-06,
+ "loss": 0.0168,
+ "step": 109
+ },
+ {
+ "epoch": 1.28125,
+ "grad_norm": 0.5390430092811584,
+ "learning_rate": 4.992664502959351e-06,
+ "loss": 0.017,
+ "step": 110
+ },
+ {
+ "epoch": 1.29296875,
+ "grad_norm": 0.4966451823711395,
+ "learning_rate": 4.991124960611916e-06,
+ "loss": 0.0145,
+ "step": 111
+ },
+ {
+ "epoch": 1.3046875,
+ "grad_norm": 0.6148604154586792,
+ "learning_rate": 4.989439158314566e-06,
+ "loss": 0.0139,
+ "step": 112
+ },
+ {
+ "epoch": 1.31640625,
+ "grad_norm": 0.6303534507751465,
+ "learning_rate": 4.9876071950448185e-06,
+ "loss": 0.0118,
+ "step": 113
+ },
+ {
+ "epoch": 1.328125,
+ "grad_norm": 0.5410207509994507,
+ "learning_rate": 4.98562917836165e-06,
+ "loss": 0.0094,
+ "step": 114
+ },
+ {
+ "epoch": 1.33984375,
+ "grad_norm": 0.5350080132484436,
+ "learning_rate": 4.983505224399188e-06,
+ "loss": 0.0158,
+ "step": 115
+ },
+ {
+ "epoch": 1.3515625,
+ "grad_norm": 1.017317295074463,
+ "learning_rate": 4.9812354578598876e-06,
+ "loss": 0.0201,
+ "step": 116
+ },
+ {
+ "epoch": 1.36328125,
+ "grad_norm": 0.6891007423400879,
+ "learning_rate": 4.978820012007213e-06,
+ "loss": 0.0127,
+ "step": 117
+ },
+ {
+ "epoch": 1.375,
+ "grad_norm": 0.4756389260292053,
+ "learning_rate": 4.976259028657812e-06,
+ "loss": 0.0188,
+ "step": 118
+ },
+ {
+ "epoch": 1.38671875,
+ "grad_norm": 0.5957350730895996,
+ "learning_rate": 4.973552658173186e-06,
+ "loss": 0.011,
+ "step": 119
+ },
+ {
+ "epoch": 1.3984375,
+ "grad_norm": 0.5012223720550537,
+ "learning_rate": 4.970701059450872e-06,
+ "loss": 0.0138,
+ "step": 120
+ },
+ {
+ "epoch": 1.41015625,
+ "grad_norm": 0.4408419132232666,
+ "learning_rate": 4.9677043999151e-06,
+ "loss": 0.0144,
+ "step": 121
+ },
+ {
+ "epoch": 1.421875,
+ "grad_norm": 0.5721736550331116,
+ "learning_rate": 4.964562855506976e-06,
+ "loss": 0.0135,
+ "step": 122
+ },
+ {
+ "epoch": 1.43359375,
+ "grad_norm": 0.5479208827018738,
+ "learning_rate": 4.961276610674141e-06,
+ "loss": 0.0128,
+ "step": 123
+ },
+ {
+ "epoch": 1.4453125,
+ "grad_norm": 1.0117675065994263,
+ "learning_rate": 4.9578458583599495e-06,
+ "loss": 0.0111,
+ "step": 124
+ },
+ {
+ "epoch": 1.45703125,
+ "grad_norm": 0.5504026412963867,
+ "learning_rate": 4.954270799992138e-06,
+ "loss": 0.0083,
+ "step": 125
+ },
+ {
+ "epoch": 1.46875,
+ "grad_norm": 0.48403099179267883,
+ "learning_rate": 4.950551645470998e-06,
+ "loss": 0.0083,
+ "step": 126
+ },
+ {
+ "epoch": 1.48046875,
+ "grad_norm": 0.6866800785064697,
+ "learning_rate": 4.9466886131570565e-06,
+ "loss": 0.0085,
+ "step": 127
+ },
+ {
+ "epoch": 1.4921875,
+ "grad_norm": 0.872557520866394,
+ "learning_rate": 4.942681929858249e-06,
+ "loss": 0.0102,
+ "step": 128
+ },
+ {
+ "epoch": 1.50390625,
+ "grad_norm": 0.6924716234207153,
+ "learning_rate": 4.9385318308166065e-06,
+ "loss": 0.012,
+ "step": 129
+ },
+ {
+ "epoch": 1.515625,
+ "grad_norm": 0.5060118436813354,
+ "learning_rate": 4.934238559694448e-06,
+ "loss": 0.0084,
+ "step": 130
+ },
+ {
+ "epoch": 1.52734375,
+ "grad_norm": 0.6256171464920044,
+ "learning_rate": 4.929802368560066e-06,
+ "loss": 0.0081,
+ "step": 131
+ },
+ {
+ "epoch": 1.5390625,
+ "grad_norm": 0.5422537922859192,
+ "learning_rate": 4.925223517872934e-06,
+ "loss": 0.0077,
+ "step": 132
+ },
+ {
+ "epoch": 1.55078125,
+ "grad_norm": 0.953416109085083,
+ "learning_rate": 4.920502276468408e-06,
+ "loss": 0.0078,
+ "step": 133
+ },
+ {
+ "epoch": 1.5625,
+ "grad_norm": 0.4540804624557495,
+ "learning_rate": 4.915638921541952e-06,
+ "loss": 0.0097,
+ "step": 134
+ },
+ {
+ "epoch": 1.57421875,
+ "grad_norm": 0.3773641884326935,
+ "learning_rate": 4.9106337386328524e-06,
+ "loss": 0.0098,
+ "step": 135
+ },
+ {
+ "epoch": 1.5859375,
+ "grad_norm": 0.7970175743103027,
+ "learning_rate": 4.905487021607462e-06,
+ "loss": 0.0056,
+ "step": 136
+ },
+ {
+ "epoch": 1.59765625,
+ "grad_norm": 0.45197635889053345,
+ "learning_rate": 4.900199072641937e-06,
+ "loss": 0.0078,
+ "step": 137
+ },
+ {
+ "epoch": 1.609375,
+ "grad_norm": 0.38231438398361206,
+ "learning_rate": 4.894770202204509e-06,
+ "loss": 0.0072,
+ "step": 138
+ },
+ {
+ "epoch": 1.62109375,
+ "grad_norm": 0.2945426404476166,
+ "learning_rate": 4.889200729037241e-06,
+ "loss": 0.0086,
+ "step": 139
+ },
+ {
+ "epoch": 1.6328125,
+ "grad_norm": 0.49699363112449646,
+ "learning_rate": 4.883490980137327e-06,
+ "loss": 0.0073,
+ "step": 140
+ },
+ {
+ "epoch": 1.64453125,
+ "grad_norm": 0.38112956285476685,
+ "learning_rate": 4.8776412907378845e-06,
+ "loss": 0.0056,
+ "step": 141
+ },
+ {
+ "epoch": 1.65625,
+ "grad_norm": 0.46780407428741455,
+ "learning_rate": 4.871652004288275e-06,
+ "loss": 0.0078,
+ "step": 142
+ },
+ {
+ "epoch": 1.66796875,
+ "grad_norm": 0.43764325976371765,
+ "learning_rate": 4.865523472433942e-06,
+ "loss": 0.005,
+ "step": 143
+ },
+ {
+ "epoch": 1.6796875,
+ "grad_norm": 0.3445664644241333,
+ "learning_rate": 4.859256054995758e-06,
+ "loss": 0.0069,
+ "step": 144
+ },
+ {
+ "epoch": 1.69140625,
+ "grad_norm": 0.40410447120666504,
+ "learning_rate": 4.8528501199489045e-06,
+ "loss": 0.0088,
+ "step": 145
+ },
+ {
+ "epoch": 1.703125,
+ "grad_norm": 0.5876736640930176,
+ "learning_rate": 4.846306043401268e-06,
+ "loss": 0.0057,
+ "step": 146
+ },
+ {
+ "epoch": 1.71484375,
+ "grad_norm": 0.5149250626564026,
+ "learning_rate": 4.839624209571352e-06,
+ "loss": 0.0056,
+ "step": 147
+ },
+ {
+ "epoch": 1.7265625,
+ "grad_norm": 0.7009180784225464,
+ "learning_rate": 4.832805010765724e-06,
+ "loss": 0.0088,
+ "step": 148
+ },
+ {
+ "epoch": 1.73828125,
+ "grad_norm": 0.42258334159851074,
+ "learning_rate": 4.8258488473559794e-06,
+ "loss": 0.004,
+ "step": 149
+ },
+ {
+ "epoch": 1.75,
+ "grad_norm": 0.39231887459754944,
+ "learning_rate": 4.8187561277552376e-06,
+ "loss": 0.005,
+ "step": 150
+ },
+ {
+ "epoch": 1.76171875,
+ "grad_norm": 0.3317432701587677,
+ "learning_rate": 4.811527268394157e-06,
+ "loss": 0.0038,
+ "step": 151
+ },
+ {
+ "epoch": 1.7734375,
+ "grad_norm": 0.5022267699241638,
+ "learning_rate": 4.804162693696494e-06,
+ "loss": 0.0056,
+ "step": 152
+ },
+ {
+ "epoch": 1.78515625,
+ "grad_norm": 0.39019322395324707,
+ "learning_rate": 4.796662836054176e-06,
+ "loss": 0.0053,
+ "step": 153
+ },
+ {
+ "epoch": 1.796875,
+ "grad_norm": 0.5674042701721191,
+ "learning_rate": 4.789028135801919e-06,
+ "loss": 0.007,
+ "step": 154
+ },
+ {
+ "epoch": 1.80859375,
+ "grad_norm": 0.5690024495124817,
+ "learning_rate": 4.7812590411913755e-06,
+ "loss": 0.0053,
+ "step": 155
+ },
+ {
+ "epoch": 1.8203125,
+ "grad_norm": 0.23775412142276764,
+ "learning_rate": 4.773356008364812e-06,
+ "loss": 0.0031,
+ "step": 156
+ },
+ {
+ "epoch": 1.83203125,
+ "grad_norm": 0.4698558747768402,
+ "learning_rate": 4.765319501328332e-06,
+ "loss": 0.0021,
+ "step": 157
+ },
+ {
+ "epoch": 1.84375,
+ "grad_norm": 0.21603639423847198,
+ "learning_rate": 4.757149991924633e-06,
+ "loss": 0.0046,
+ "step": 158
+ },
+ {
+ "epoch": 1.85546875,
+ "grad_norm": 0.33830726146698,
+ "learning_rate": 4.748847959805297e-06,
+ "loss": 0.0022,
+ "step": 159
+ },
+ {
+ "epoch": 1.8671875,
+ "grad_norm": 0.44919782876968384,
+ "learning_rate": 4.740413892402639e-06,
+ "loss": 0.0032,
+ "step": 160
+ },
+ {
+ "epoch": 1.87890625,
+ "grad_norm": 0.5119614601135254,
+ "learning_rate": 4.731848284901082e-06,
+ "loss": 0.006,
+ "step": 161
+ },
+ {
+ "epoch": 1.890625,
+ "grad_norm": 0.3875437080860138,
+ "learning_rate": 4.723151640208084e-06,
+ "loss": 0.0024,
+ "step": 162
+ },
+ {
+ "epoch": 1.90234375,
+ "grad_norm": 0.3179910182952881,
+ "learning_rate": 4.714324468924614e-06,
+ "loss": 0.0037,
+ "step": 163
+ },
+ {
+ "epoch": 1.9140625,
+ "grad_norm": 0.43395644426345825,
+ "learning_rate": 4.705367289315172e-06,
+ "loss": 0.0027,
+ "step": 164
+ },
+ {
+ "epoch": 1.92578125,
+ "grad_norm": 0.3703945577144623,
+ "learning_rate": 4.696280627277356e-06,
+ "loss": 0.0047,
+ "step": 165
+ },
+ {
+ "epoch": 1.9375,
+ "grad_norm": 0.2503529191017151,
+ "learning_rate": 4.687065016310996e-06,
+ "loss": 0.0052,
+ "step": 166
+ },
+ {
+ "epoch": 1.94921875,
+ "grad_norm": 0.3613075315952301,
+ "learning_rate": 4.6777209974868194e-06,
+ "loss": 0.0034,
+ "step": 167
+ },
+ {
+ "epoch": 1.9609375,
+ "grad_norm": 0.3578515350818634,
+ "learning_rate": 4.668249119414692e-06,
+ "loss": 0.0021,
+ "step": 168
+ },
+ {
+ "epoch": 1.97265625,
+ "grad_norm": 0.1784515529870987,
+ "learning_rate": 4.6586499382113985e-06,
+ "loss": 0.0018,
+ "step": 169
+ },
+ {
+ "epoch": 1.984375,
+ "grad_norm": 0.259198397397995,
+ "learning_rate": 4.648924017468003e-06,
+ "loss": 0.0009,
+ "step": 170
+ }
+ ],
+ "logging_steps": 1,
+ "max_steps": 510,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 6,
+ "save_steps": 85,
+ "stateful_callbacks": {
+ "TrainerControl": {
+ "args": {
+ "should_epoch_stop": false,
+ "should_evaluate": false,
+ "should_log": false,
+ "should_save": true,
+ "should_training_stop": false
+ },
+ "attributes": {}
+ }
+ },
+ "total_flos": 4.277422139847475e+17,
+ "train_batch_size": 4,
+ "trial_name": null,
+ "trial_params": null
+}
diff --git a/checkpoint-170/training_args.bin b/checkpoint-170/training_args.bin
new file mode 100644
index 0000000000000000000000000000000000000000..31435c2b54979c306fa2a089f64bc8d21e1d21cf
--- /dev/null
+++ b/checkpoint-170/training_args.bin
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ae0e02a237d0ed5071f0d2c656d0cc6fa0293647ec7cffc6f8d299311f592cdc
+size 8056
diff --git a/checkpoint-170/zero_to_fp32.py b/checkpoint-170/zero_to_fp32.py
new file mode 100644
index 0000000000000000000000000000000000000000..24cc342e78d1a006c782b3a4cd68d9ce786d8fd8
--- /dev/null
+++ b/checkpoint-170/zero_to_fp32.py
@@ -0,0 +1,604 @@
+#!/usr/bin/env python
+
+# Copyright (c) Microsoft Corporation.
+# SPDX-License-Identifier: Apache-2.0
+
+# DeepSpeed Team
+
+# This script extracts fp32 consolidated weights from ZeRO stage 1, 2 and 3 DeepSpeed checkpoints. It gets
+# copied into the top-level checkpoint dir, so the user can easily do the conversion at any point in
+# the future. Once extracted, the weights don't require DeepSpeed and can be used in any
+# application.
+#
+# example: python zero_to_fp32.py . pytorch_model.bin
+
+import argparse
+import torch
+import glob
+import math
+import os
+import re
+from collections import OrderedDict
+from dataclasses import dataclass
+
+# While this script doesn't use DeepSpeed to recover data, DeepSpeed must be available in the
+# current Python environment, since the checkpoints are pickled with DeepSpeed data structures.
+from deepspeed.utils import logger
+from deepspeed.checkpoint.constants import (DS_VERSION, OPTIMIZER_STATE_DICT, SINGLE_PARTITION_OF_FP32_GROUPS,
+ FP32_FLAT_GROUPS, ZERO_STAGE, PARTITION_COUNT, PARAM_SHAPES, BUFFER_NAMES,
+ FROZEN_PARAM_SHAPES, FROZEN_PARAM_FRAGMENTS)
+
+
+@dataclass
+class zero_model_state:
+ buffers: dict()
+ param_shapes: dict()
+ shared_params: list
+ ds_version: int
+ frozen_param_shapes: dict()
+ frozen_param_fragments: dict()
+
+
+debug = 0
+
+# load to cpu
+device = torch.device('cpu')
+
+
+def atoi(text):
+ return int(text) if text.isdigit() else text
+
+
+def natural_keys(text):
+ '''
+ alist.sort(key=natural_keys) sorts in human order
+ http://nedbatchelder.com/blog/200712/human_sorting.html
+ (See Toothy's implementation in the comments)
+ '''
+ return [atoi(c) for c in re.split(r'(\d+)', text)]
+
+
+def get_model_state_file(checkpoint_dir, zero_stage):
+ if not os.path.isdir(checkpoint_dir):
+ raise FileNotFoundError(f"Directory '{checkpoint_dir}' doesn't exist")
+
+ # there should be only one file
+    if zero_stage <= 2:
+        file = os.path.join(checkpoint_dir, "mp_rank_00_model_states.pt")
+    elif zero_stage == 3:
+        file = os.path.join(checkpoint_dir, "zero_pp_rank_0_mp_rank_00_model_states.pt")
+    else:
+        # guard against an undefined `file` below for unexpected stages
+        raise ValueError(f"unknown zero stage {zero_stage}")
+
+ if not os.path.exists(file):
+ raise FileNotFoundError(f"can't find model states file at '{file}'")
+
+ return file
+
+
+def get_checkpoint_files(checkpoint_dir, glob_pattern):
+ # XXX: need to test that this simple glob rule works for multi-node setup too
+ ckpt_files = sorted(glob.glob(os.path.join(checkpoint_dir, glob_pattern)), key=natural_keys)
+
+ if len(ckpt_files) == 0:
+ raise FileNotFoundError(f"can't find {glob_pattern} files in directory '{checkpoint_dir}'")
+
+ return ckpt_files
+
+
+def get_optim_files(checkpoint_dir):
+ return get_checkpoint_files(checkpoint_dir, "*_optim_states.pt")
+
+
+def get_model_state_files(checkpoint_dir):
+ return get_checkpoint_files(checkpoint_dir, "*_model_states.pt")
+
+
+def parse_model_states(files):
+ zero_model_states = []
+ for file in files:
+ state_dict = torch.load(file, map_location=device)
+
+ if BUFFER_NAMES not in state_dict:
+ raise ValueError(f"{file} is not a model state checkpoint")
+ buffer_names = state_dict[BUFFER_NAMES]
+ if debug:
+ print("Found buffers:", buffer_names)
+
+ # recover just the buffers while restoring them to fp32 if they were saved in fp16
+ buffers = {k: v.float() for k, v in state_dict["module"].items() if k in buffer_names}
+ param_shapes = state_dict[PARAM_SHAPES]
+
+ # collect parameters that are included in param_shapes
+ param_names = []
+ for s in param_shapes:
+ for name in s.keys():
+ param_names.append(name)
+
+ # update with frozen parameters
+ frozen_param_shapes = state_dict.get(FROZEN_PARAM_SHAPES, None)
+ if frozen_param_shapes is not None:
+ if debug:
+ print(f"Found frozen_param_shapes: {frozen_param_shapes}")
+ param_names += list(frozen_param_shapes.keys())
+
+ # handle shared params
+ shared_params = [[k, v] for k, v in state_dict["shared_params"].items()]
+
+ ds_version = state_dict.get(DS_VERSION, None)
+
+ frozen_param_fragments = state_dict.get(FROZEN_PARAM_FRAGMENTS, None)
+
+ z_model_state = zero_model_state(buffers=buffers,
+ param_shapes=param_shapes,
+ shared_params=shared_params,
+ ds_version=ds_version,
+ frozen_param_shapes=frozen_param_shapes,
+ frozen_param_fragments=frozen_param_fragments)
+ zero_model_states.append(z_model_state)
+
+ return zero_model_states
+
+
+def parse_optim_states(files, ds_checkpoint_dir):
+
+ total_files = len(files)
+ state_dicts = []
+ for f in files:
+ state_dict = torch.load(f, map_location=device)
+        # immediately discard the two potentially huge optimizer states, as we only care about the
+        # fp32 master weights, and also handle the case where they were already removed by another helper script
+ state_dict["optimizer_state_dict"].pop("optimizer_state_dict", None)
+ state_dicts.append(state_dict)
+
+    if ZERO_STAGE not in state_dicts[0][OPTIMIZER_STATE_DICT]:
+ raise ValueError(f"{files[0]} is not a zero checkpoint")
+ zero_stage = state_dicts[0][OPTIMIZER_STATE_DICT][ZERO_STAGE]
+ world_size = state_dicts[0][OPTIMIZER_STATE_DICT][PARTITION_COUNT]
+
+ # For ZeRO-2 each param group can have different partition_count as data parallelism for expert
+ # parameters can be different from data parallelism for non-expert parameters. So we can just
+ # use the max of the partition_count to get the dp world_size.
+
+    if isinstance(world_size, list):
+ world_size = max(world_size)
+
+ if world_size != total_files:
+ raise ValueError(
+ f"Expected {world_size} of '*_optim_states.pt' under '{ds_checkpoint_dir}' but found {total_files} files. "
+ "Possibly due to an overwrite of an old checkpoint, or a checkpoint didn't get saved by one or more processes."
+ )
+
+ # the groups are named differently in each stage
+ if zero_stage <= 2:
+ fp32_groups_key = SINGLE_PARTITION_OF_FP32_GROUPS
+ elif zero_stage == 3:
+ fp32_groups_key = FP32_FLAT_GROUPS
+ else:
+ raise ValueError(f"unknown zero stage {zero_stage}")
+
+ if zero_stage <= 2:
+ fp32_flat_groups = [state_dicts[i][OPTIMIZER_STATE_DICT][fp32_groups_key] for i in range(len(state_dicts))]
+ elif zero_stage == 3:
+ # if there is more than one param group, there will be multiple flattened tensors - one
+ # flattened tensor per group - for simplicity merge them into a single tensor
+ #
+ # XXX: could make the script more memory efficient for when there are multiple groups - it
+ # will require matching the sub-lists of param_shapes for each param group flattened tensor
+
+ fp32_flat_groups = [
+ torch.cat(state_dicts[i][OPTIMIZER_STATE_DICT][fp32_groups_key], 0) for i in range(len(state_dicts))
+ ]
+
+ return zero_stage, world_size, fp32_flat_groups
+
+
+def _get_fp32_state_dict_from_zero_checkpoint(ds_checkpoint_dir, exclude_frozen_parameters):
+ """
+ Returns fp32 state_dict reconstructed from ds checkpoint
+
+ Args:
+ - ``ds_checkpoint_dir``: path to the deepspeed checkpoint folder (where the optimizer files are)
+
+ """
+ print(f"Processing zero checkpoint '{ds_checkpoint_dir}'")
+
+ optim_files = get_optim_files(ds_checkpoint_dir)
+ zero_stage, world_size, fp32_flat_groups = parse_optim_states(optim_files, ds_checkpoint_dir)
+ print(f"Detected checkpoint of type zero stage {zero_stage}, world_size: {world_size}")
+
+ model_files = get_model_state_files(ds_checkpoint_dir)
+
+ zero_model_states = parse_model_states(model_files)
+ print(f'Parsing checkpoint created by deepspeed=={zero_model_states[0].ds_version}')
+
+ if zero_stage <= 2:
+ return _get_fp32_state_dict_from_zero2_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters)
+ elif zero_stage == 3:
+ return _get_fp32_state_dict_from_zero3_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters)
+
+
+def _zero2_merge_frozen_params(state_dict, zero_model_states):
+ if zero_model_states[0].frozen_param_shapes is None or len(zero_model_states[0].frozen_param_shapes) == 0:
+ return
+
+ frozen_param_shapes = zero_model_states[0].frozen_param_shapes
+ frozen_param_fragments = zero_model_states[0].frozen_param_fragments
+
+ if debug:
+ num_elem = sum(s.numel() for s in frozen_param_shapes.values())
+ print(f'rank 0: {FROZEN_PARAM_SHAPES}.numel = {num_elem}')
+
+ wanted_params = len(frozen_param_shapes)
+ wanted_numel = sum(s.numel() for s in frozen_param_shapes.values())
+ avail_numel = sum([p.numel() for p in frozen_param_fragments.values()])
+ print(f'Frozen params: Have {avail_numel} numels to process.')
+ print(f'Frozen params: Need {wanted_numel} numels in {wanted_params} params')
+
+ total_params = 0
+ total_numel = 0
+ for name, shape in frozen_param_shapes.items():
+ total_params += 1
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+
+ state_dict[name] = frozen_param_fragments[name]
+
+ if debug:
+ print(f"{name} full shape: {shape} unpartitioned numel {unpartitioned_numel} ")
+
+ print(f"Reconstructed Frozen fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _has_callable(obj, fn):
+ attr = getattr(obj, fn, None)
+ return callable(attr)
+
+
+def _zero2_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states):
+ param_shapes = zero_model_states[0].param_shapes
+
+ # Reconstruction protocol:
+ #
+ # XXX: document this
+
+ if debug:
+ for i in range(world_size):
+ for j in range(len(fp32_flat_groups[0])):
+ print(f"{FP32_FLAT_GROUPS}[{i}][{j}].shape={fp32_flat_groups[i][j].shape}")
+
+ # XXX: memory usage doubles here (zero2)
+ num_param_groups = len(fp32_flat_groups[0])
+ merged_single_partition_of_fp32_groups = []
+ for i in range(num_param_groups):
+ merged_partitions = [sd[i] for sd in fp32_flat_groups]
+ full_single_fp32_vector = torch.cat(merged_partitions, 0)
+ merged_single_partition_of_fp32_groups.append(full_single_fp32_vector)
+ avail_numel = sum(
+ [full_single_fp32_vector.numel() for full_single_fp32_vector in merged_single_partition_of_fp32_groups])
+
+ if debug:
+ wanted_params = sum([len(shapes) for shapes in param_shapes])
+ wanted_numel = sum([sum(shape.numel() for shape in shapes.values()) for shapes in param_shapes])
+ # not asserting if there is a mismatch due to possible padding
+ print(f"Have {avail_numel} numels to process.")
+ print(f"Need {wanted_numel} numels in {wanted_params} params.")
+
+ # params
+ # XXX: for huge models that can't fit into the host's RAM we will have to recode this to support
+ # out-of-core computing solution
+ total_numel = 0
+ total_params = 0
+ for shapes, full_single_fp32_vector in zip(param_shapes, merged_single_partition_of_fp32_groups):
+ offset = 0
+ avail_numel = full_single_fp32_vector.numel()
+ for name, shape in shapes.items():
+
+ unpartitioned_numel = shape.numel() if _has_callable(shape, 'numel') else math.prod(shape)
+ total_numel += unpartitioned_numel
+ total_params += 1
+
+ if debug:
+ print(f"{name} full shape: {shape} unpartitioned numel {unpartitioned_numel} ")
+ state_dict[name] = full_single_fp32_vector.narrow(0, offset, unpartitioned_numel).view(shape)
+ offset += unpartitioned_numel
+
+ # Z2 started to align to 2*world_size to improve nccl performance. Therefore both offset and
+ # avail_numel can differ by anywhere between 0..2*world_size. Due to two unrelated complex
+ # paddings performed in the code it's almost impossible to predict the exact numbers w/o the
+ # live optimizer object, so we are checking that the numbers are within the right range
+ align_to = 2 * world_size
+
+ def zero2_align(x):
+ return align_to * math.ceil(x / align_to)
+
+ if debug:
+ print(f"original offset={offset}, avail_numel={avail_numel}")
+
+ offset = zero2_align(offset)
+ avail_numel = zero2_align(avail_numel)
+
+ if debug:
+ print(f"aligned offset={offset}, avail_numel={avail_numel}")
+
+ # Sanity check
+ if offset != avail_numel:
+ raise ValueError(f"consumed {offset} numels out of {avail_numel} - something is wrong")
+
+ print(f"Reconstructed fp32 state dict with {total_params} params {total_numel} elements")
+
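The `2 * world_size` alignment used in the ZeRO-2 merge above can be exercised in isolation; a minimal standalone sketch (the helper is restated here for self-containment, it is not the upstream API):

```python
import math

def zero2_align(x, world_size):
    # round x up to the next multiple of 2 * world_size,
    # mirroring the nccl-alignment padding checked above
    align_to = 2 * world_size
    return align_to * math.ceil(x / align_to)

print(zero2_align(100, 4))  # 104: next multiple of 8 above 100
print(zero2_align(96, 4))   # 96: already aligned
```

This is why `offset` and `avail_numel` are only compared after both are rounded up: either may carry up to `2 * world_size - 1` elements of padding.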
+
+def _get_fp32_state_dict_from_zero2_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters):
+ state_dict = OrderedDict()
+
+ # buffers
+ buffers = zero_model_states[0].buffers
+ state_dict.update(buffers)
+ if debug:
+ print(f"added {len(buffers)} buffers")
+
+ if not exclude_frozen_parameters:
+ _zero2_merge_frozen_params(state_dict, zero_model_states)
+
+ _zero2_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states)
+
+ # recover shared parameters
+ for pair in zero_model_states[0].shared_params:
+ if pair[1] in state_dict:
+ state_dict[pair[0]] = state_dict[pair[1]]
+
+ return state_dict
+
+
+def zero3_partitioned_param_info(unpartitioned_numel, world_size):
+ remainder = unpartitioned_numel % world_size
+ padding_numel = (world_size - remainder) if remainder else 0
+ partitioned_numel = math.ceil(unpartitioned_numel / world_size)
+ return partitioned_numel, padding_numel
+
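A worked example of the partitioning arithmetic above (the function is restated verbatim so the snippet is self-contained):

```python
import math

def zero3_partitioned_param_info(unpartitioned_numel, world_size):
    # each rank stores ceil(numel / world_size) elements; the final
    # fragment is padded when numel isn't evenly divisible
    remainder = unpartitioned_numel % world_size
    padding_numel = (world_size - remainder) if remainder else 0
    partitioned_numel = math.ceil(unpartitioned_numel / world_size)
    return partitioned_numel, padding_numel

print(zero3_partitioned_param_info(10, 4))  # (3, 2): 3 elements per rank, 2 of them padding
print(zero3_partitioned_param_info(8, 4))   # (2, 0): evenly divisible, no padding
```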
+
+def _zero3_merge_frozen_params(state_dict, world_size, zero_model_states):
+ if zero_model_states[0].frozen_param_shapes is None or len(zero_model_states[0].frozen_param_shapes) == 0:
+ return
+
+ if debug:
+ for i in range(world_size):
+ num_elem = sum(s.numel() for s in zero_model_states[i].frozen_param_fragments.values())
+ print(f'rank {i}: {FROZEN_PARAM_SHAPES}.numel = {num_elem}')
+
+ frozen_param_shapes = zero_model_states[0].frozen_param_shapes
+ wanted_params = len(frozen_param_shapes)
+ wanted_numel = sum(s.numel() for s in frozen_param_shapes.values())
+ avail_numel = sum([p.numel() for p in zero_model_states[0].frozen_param_fragments.values()]) * world_size
+ print(f'Frozen params: Have {avail_numel} numels to process.')
+ print(f'Frozen params: Need {wanted_numel} numels in {wanted_params} params')
+
+ total_params = 0
+ total_numel = 0
+ for name, shape in zero_model_states[0].frozen_param_shapes.items():
+ total_params += 1
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+
+ param_frags = tuple(model_state.frozen_param_fragments[name] for model_state in zero_model_states)
+ state_dict[name] = torch.cat(param_frags, 0).narrow(0, 0, unpartitioned_numel).view(shape)
+
+ partitioned_numel, partitioned_padding_numel = zero3_partitioned_param_info(unpartitioned_numel, world_size)
+
+ if debug:
+ print(
+ f"Frozen params: {total_params} {name} full shape: {shape} partition0 numel={partitioned_numel} partitioned_padding_numel={partitioned_padding_numel}"
+ )
+
+ print(f"Reconstructed Frozen fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _zero3_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states):
+ param_shapes = zero_model_states[0].param_shapes
+ avail_numel = fp32_flat_groups[0].numel() * world_size
+ # Reconstruction protocol: For zero3 we need to zip the partitions together at boundary of each
+ # param, re-consolidating each param, while dealing with padding if any
+
+ # merge list of dicts, preserving order
+ param_shapes = {k: v for d in param_shapes for k, v in d.items()}
+
+ if debug:
+ for i in range(world_size):
+ print(f"{FP32_FLAT_GROUPS}[{i}].shape={fp32_flat_groups[i].shape}")
+
+ wanted_params = len(param_shapes)
+ wanted_numel = sum(shape.numel() for shape in param_shapes.values())
+ # not asserting if there is a mismatch due to possible padding
+ avail_numel = fp32_flat_groups[0].numel() * world_size
+ print(f"Trainable params: Have {avail_numel} numels to process.")
+ print(f"Trainable params: Need {wanted_numel} numels in {wanted_params} params.")
+
+ # params
+    # XXX: for huge models that can't fit into the host's RAM we will have to recode this to
+    # support an out-of-core computing solution
+ offset = 0
+ total_numel = 0
+ total_params = 0
+ for name, shape in param_shapes.items():
+
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+ total_params += 1
+
+ partitioned_numel, partitioned_padding_numel = zero3_partitioned_param_info(unpartitioned_numel, world_size)
+
+ if debug:
+ print(
+ f"Trainable params: {total_params} {name} full shape: {shape} partition0 numel={partitioned_numel} partitioned_padding_numel={partitioned_padding_numel}"
+ )
+
+ # XXX: memory usage doubles here
+ state_dict[name] = torch.cat(
+ tuple(fp32_flat_groups[i].narrow(0, offset, partitioned_numel) for i in range(world_size)),
+ 0).narrow(0, 0, unpartitioned_numel).view(shape)
+ offset += partitioned_numel
+
+ offset *= world_size
+
+ # Sanity check
+ if offset != avail_numel:
+ raise ValueError(f"consumed {offset} numels out of {avail_numel} - something is wrong")
+
+ print(f"Reconstructed Trainable fp32 state dict with {total_params} params {total_numel} elements")
+
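The cat/narrow reconstruction above can be illustrated without torch; a list-based sketch of how per-rank fragments are re-joined and the alignment padding dropped (illustrative only, not the upstream code path):

```python
import math

def partition(values, world_size):
    # flatten-and-split as ZeRO-3 does: pad to a multiple of world_size,
    # then give each rank an equal-sized contiguous fragment
    part = math.ceil(len(values) / world_size)
    padded = values + [0.0] * (part * world_size - len(values))
    return [padded[r * part:(r + 1) * part] for r in range(world_size)]

def reconstruct(fragments, numel):
    # concatenate the rank fragments, then drop the trailing padding;
    # the list-level analogue of torch.cat(frags, 0).narrow(0, 0, numel)
    merged = [v for frag in fragments for v in frag]
    return merged[:numel]

param = [float(i) for i in range(10)]
frags = partition(param, 4)  # 4 fragments of 3 elements each, 2 of them padding
print(reconstruct(frags, len(param)) == param)  # True
```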
+
+def _get_fp32_state_dict_from_zero3_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters):
+ state_dict = OrderedDict()
+
+ # buffers
+ buffers = zero_model_states[0].buffers
+ state_dict.update(buffers)
+ if debug:
+ print(f"added {len(buffers)} buffers")
+
+ if not exclude_frozen_parameters:
+ _zero3_merge_frozen_params(state_dict, world_size, zero_model_states)
+
+ _zero3_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states)
+
+ # recover shared parameters
+ for pair in zero_model_states[0].shared_params:
+ if pair[1] in state_dict:
+ state_dict[pair[0]] = state_dict[pair[1]]
+
+ return state_dict
+
+
+def get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag=None, exclude_frozen_parameters=False):
+ """
+ Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated state_dict that can be loaded with
+ ``load_state_dict()`` and used for training without DeepSpeed or shared with others, for example
+ via a model hub.
+
+ Args:
+ - ``checkpoint_dir``: path to the desired checkpoint folder
+        - ``tag``: checkpoint tag used as a unique identifier for the checkpoint. If not provided, will attempt to read the tag from the file named ``latest`` in the checkpoint folder, e.g., ``global_step14``
+ - ``exclude_frozen_parameters``: exclude frozen parameters
+
+ Returns:
+ - pytorch ``state_dict``
+
+ Note: this approach may not work if your application doesn't have sufficient free CPU memory and
+ you may need to use the offline approach using the ``zero_to_fp32.py`` script that is saved with
+ the checkpoint.
+
+ A typical usage might be ::
+
+ from deepspeed.utils.zero_to_fp32 import get_fp32_state_dict_from_zero_checkpoint
+ # do the training and checkpoint saving
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir) # already on cpu
+ model = model.cpu() # move to cpu
+ model.load_state_dict(state_dict)
+ # submit to model hub or save the model to share with others
+
+    In this example the ``model`` will no longer be usable in the deepspeed context of the same
+    application; i.e., you will need to re-initialize the deepspeed engine, since
+    ``model.load_state_dict(state_dict)`` will remove all the deepspeed magic from it.
+
+ If you want it all done for you, use ``load_state_dict_from_zero_checkpoint`` instead.
+
+ """
+ if tag is None:
+ latest_path = os.path.join(checkpoint_dir, 'latest')
+ if os.path.isfile(latest_path):
+ with open(latest_path, 'r') as fd:
+ tag = fd.read().strip()
+ else:
+ raise ValueError(f"Unable to find 'latest' file at {latest_path}")
+
+ ds_checkpoint_dir = os.path.join(checkpoint_dir, tag)
+
+ if not os.path.isdir(ds_checkpoint_dir):
+ raise FileNotFoundError(f"Directory '{ds_checkpoint_dir}' doesn't exist")
+
+ return _get_fp32_state_dict_from_zero_checkpoint(ds_checkpoint_dir, exclude_frozen_parameters)
+
+
+def convert_zero_checkpoint_to_fp32_state_dict(checkpoint_dir, output_file, tag=None, exclude_frozen_parameters=False):
+ """
+ Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated ``state_dict`` file that can be
+ loaded with ``torch.load(file)`` + ``load_state_dict()`` and used for training without DeepSpeed.
+
+ Args:
+ - ``checkpoint_dir``: path to the desired checkpoint folder. (one that contains the tag-folder, like ``global_step14``)
+ - ``output_file``: path to the pytorch fp32 state_dict output file (e.g. path/pytorch_model.bin)
+        - ``tag``: checkpoint tag used as a unique identifier for the checkpoint. If not provided, will attempt to read the tag from the file named ``latest`` in the checkpoint folder, e.g., ``global_step14``
+ - ``exclude_frozen_parameters``: exclude frozen parameters
+ """
+
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag, exclude_frozen_parameters)
+ print(f"Saving fp32 state dict to {output_file}")
+ torch.save(state_dict, output_file)
+
+
+def load_state_dict_from_zero_checkpoint(model, checkpoint_dir, tag=None):
+ """
+ 1. Put the provided model to cpu
+ 2. Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated ``state_dict``
+ 3. Load it into the provided model
+
+ Args:
+ - ``model``: the model object to update
+ - ``checkpoint_dir``: path to the desired checkpoint folder. (one that contains the tag-folder, like ``global_step14``)
+        - ``tag``: checkpoint tag used as a unique identifier for the checkpoint. If not provided, will attempt to read the tag from the file named ``latest`` in the checkpoint folder, e.g., ``global_step14``
+
+ Returns:
+        - ``model``: modified model
+
+    Make sure you have plenty of CPU memory available before you call this function. If you don't
+    have enough, use the ``zero_to_fp32.py`` utility to do the conversion. You will find it
+    conveniently placed for you in the checkpoint folder.
+
+ A typical usage might be ::
+
+ from deepspeed.utils.zero_to_fp32 import load_state_dict_from_zero_checkpoint
+ model = load_state_dict_from_zero_checkpoint(trainer.model, checkpoint_dir)
+ # submit to model hub or save the model to share with others
+
+    Note that once this has run, the ``model`` will no longer be usable in the deepspeed context
+    of the same application; i.e., you will need to re-initialize the deepspeed engine, since
+    ``model.load_state_dict(state_dict)`` will remove all the deepspeed magic from it.
+
+ """
+    logger.info("Extracting fp32 weights")
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag)
+
+    logger.info("Overwriting model with fp32 weights")
+ model = model.cpu()
+ model.load_state_dict(state_dict, strict=False)
+
+ return model
+
+
+if __name__ == "__main__":
+
+ parser = argparse.ArgumentParser()
+ parser.add_argument("checkpoint_dir",
+ type=str,
+ help="path to the desired checkpoint folder, e.g., path/checkpoint-12")
+ parser.add_argument(
+ "output_file",
+ type=str,
+ help="path to the pytorch fp32 state_dict output file (e.g. path/checkpoint-12/pytorch_model.bin)")
+ parser.add_argument("-t",
+ "--tag",
+ type=str,
+ default=None,
+ help="checkpoint tag used as a unique identifier for checkpoint. e.g., global_step1")
+ parser.add_argument("--exclude_frozen_parameters", action='store_true', help="exclude frozen parameters")
+ parser.add_argument("-d", "--debug", action='store_true', help="enable debug")
+ args = parser.parse_args()
+
+ debug = args.debug
+
+ convert_zero_checkpoint_to_fp32_state_dict(args.checkpoint_dir,
+ args.output_file,
+ tag=args.tag,
+ exclude_frozen_parameters=args.exclude_frozen_parameters)
diff --git a/checkpoint-255/README.md b/checkpoint-255/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..be5c87703f12b547886cc6a2ecfbe9ee150496fa
--- /dev/null
+++ b/checkpoint-255/README.md
@@ -0,0 +1,202 @@
+---
+base_model: meta-llama/Llama-3.1-8B-Instruct
+library_name: peft
+---
+
+# Model Card for Model ID
+
+
+
+
+
+## Model Details
+
+### Model Description
+
+
+
+
+
+- **Developed by:** [More Information Needed]
+- **Funded by [optional]:** [More Information Needed]
+- **Shared by [optional]:** [More Information Needed]
+- **Model type:** [More Information Needed]
+- **Language(s) (NLP):** [More Information Needed]
+- **License:** [More Information Needed]
+- **Finetuned from model [optional]:** [More Information Needed]
+
+### Model Sources [optional]
+
+
+
+- **Repository:** [More Information Needed]
+- **Paper [optional]:** [More Information Needed]
+- **Demo [optional]:** [More Information Needed]
+
+## Uses
+
+
+
+### Direct Use
+
+
+
+[More Information Needed]
+
+### Downstream Use [optional]
+
+
+
+[More Information Needed]
+
+### Out-of-Scope Use
+
+
+
+[More Information Needed]
+
+## Bias, Risks, and Limitations
+
+
+
+[More Information Needed]
+
+### Recommendations
+
+
+
+Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
+
+## How to Get Started with the Model
+
+Use the code below to get started with the model.
+
+[More Information Needed]
+
+## Training Details
+
+### Training Data
+
+
+
+[More Information Needed]
+
+### Training Procedure
+
+
+
+#### Preprocessing [optional]
+
+[More Information Needed]
+
+
+#### Training Hyperparameters
+
+- **Training regime:** [More Information Needed]
+
+#### Speeds, Sizes, Times [optional]
+
+
+
+[More Information Needed]
+
+## Evaluation
+
+
+
+### Testing Data, Factors & Metrics
+
+#### Testing Data
+
+
+
+[More Information Needed]
+
+#### Factors
+
+
+
+[More Information Needed]
+
+#### Metrics
+
+
+
+[More Information Needed]
+
+### Results
+
+[More Information Needed]
+
+#### Summary
+
+
+
+## Model Examination [optional]
+
+
+
+[More Information Needed]
+
+## Environmental Impact
+
+
+
+Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
+
+- **Hardware Type:** [More Information Needed]
+- **Hours used:** [More Information Needed]
+- **Cloud Provider:** [More Information Needed]
+- **Compute Region:** [More Information Needed]
+- **Carbon Emitted:** [More Information Needed]
+
+## Technical Specifications [optional]
+
+### Model Architecture and Objective
+
+[More Information Needed]
+
+### Compute Infrastructure
+
+[More Information Needed]
+
+#### Hardware
+
+[More Information Needed]
+
+#### Software
+
+[More Information Needed]
+
+## Citation [optional]
+
+
+
+**BibTeX:**
+
+[More Information Needed]
+
+**APA:**
+
+[More Information Needed]
+
+## Glossary [optional]
+
+
+
+[More Information Needed]
+
+## More Information [optional]
+
+[More Information Needed]
+
+## Model Card Authors [optional]
+
+[More Information Needed]
+
+## Model Card Contact
+
+[More Information Needed]
+### Framework versions
+
+- PEFT 0.14.0
\ No newline at end of file
diff --git a/checkpoint-255/adapter_config.json b/checkpoint-255/adapter_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..9dfb3ab60881d002c4cdbcc157a93958018fe683
--- /dev/null
+++ b/checkpoint-255/adapter_config.json
@@ -0,0 +1,40 @@
+{
+ "alpha_pattern": {},
+ "auto_mapping": null,
+ "base_model_name_or_path": "meta-llama/Llama-3.1-8B-Instruct",
+ "bias": "none",
+ "eva_config": null,
+ "exclude_modules": null,
+ "fan_in_fan_out": null,
+ "inference_mode": true,
+ "init_lora_weights": true,
+ "layer_replication": null,
+ "layers_pattern": null,
+ "layers_to_transform": null,
+ "loftq_config": {},
+ "lora_alpha": 512,
+ "lora_bias": false,
+ "lora_dropout": 0.05,
+ "megatron_config": null,
+ "megatron_core": "megatron.core",
+ "modules_to_save": [
+ "embed_tokens",
+ "lm_head"
+ ],
+ "peft_type": "LORA",
+ "r": 256,
+ "rank_pattern": {},
+ "revision": null,
+ "target_modules": [
+ "v_proj",
+ "up_proj",
+ "q_proj",
+ "o_proj",
+ "down_proj",
+ "gate_proj",
+ "k_proj"
+ ],
+ "task_type": "CAUSAL_LM",
+ "use_dora": false,
+ "use_rslora": false
+}
\ No newline at end of file
diff --git a/checkpoint-255/adapter_model.safetensors b/checkpoint-255/adapter_model.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..ff2696b824222f69fbf9f1b24cd8c1bd931acbac
--- /dev/null
+++ b/checkpoint-255/adapter_model.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5a48990210509c48f9f94cd11602df092046cbddf5bfcd28e40f9914c89eb45c
+size 3443586272
diff --git a/checkpoint-255/global_step253/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt b/checkpoint-255/global_step253/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt
new file mode 100644
index 0000000000000000000000000000000000000000..2b193afd1efa71741e3fe804350250caba3ebe6c
--- /dev/null
+++ b/checkpoint-255/global_step253/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:cf6834dcbc5d08773eb9d55e25eb6c9c1f4f0c6c98a3372069cfe2a16a9757cc
+size 20661195036
diff --git a/checkpoint-255/global_step253/mp_rank_00_model_states.pt b/checkpoint-255/global_step253/mp_rank_00_model_states.pt
new file mode 100644
index 0000000000000000000000000000000000000000..e2b3da0e3a7130866d82f1a48bf593ca0d5e047b
--- /dev/null
+++ b/checkpoint-255/global_step253/mp_rank_00_model_states.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:aedc5cbe211546493630010818a87565a9c3fb0fa7380397a795d2134529284a
+size 3555326649
diff --git a/checkpoint-255/latest b/checkpoint-255/latest
new file mode 100644
index 0000000000000000000000000000000000000000..774a43dd2e7d8685b2d1ed7f2479ccef4e9d94b6
--- /dev/null
+++ b/checkpoint-255/latest
@@ -0,0 +1 @@
+global_step253
\ No newline at end of file
diff --git a/checkpoint-255/rng_state.pth b/checkpoint-255/rng_state.pth
new file mode 100644
index 0000000000000000000000000000000000000000..84ac8626f5f414022c663fecd652f12138c45752
--- /dev/null
+++ b/checkpoint-255/rng_state.pth
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1d0f20f628ef241fba179d83cde1bc81e7b8ca798424e836d6a0fe623ce55c9a
+size 14244
diff --git a/checkpoint-255/scheduler.pt b/checkpoint-255/scheduler.pt
new file mode 100644
index 0000000000000000000000000000000000000000..f89a25d648414d0d63e2f96319e91a95207d115f
--- /dev/null
+++ b/checkpoint-255/scheduler.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5533764299361c340a5f933b830300a4da7499e1b2ad60a0e91b6b9df0055794
+size 1064
diff --git a/checkpoint-255/special_tokens_map.json b/checkpoint-255/special_tokens_map.json
new file mode 100644
index 0000000000000000000000000000000000000000..278b7f0f84be865c4687700ee7b3c63d89a51e18
--- /dev/null
+++ b/checkpoint-255/special_tokens_map.json
@@ -0,0 +1,23 @@
+{
+ "bos_token": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|eot_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<|end_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+}
diff --git a/checkpoint-255/tokenizer.json b/checkpoint-255/tokenizer.json
new file mode 100644
index 0000000000000000000000000000000000000000..1c1d8d5c9024994f1d3b00f9662b8dd89ca13cf2
--- /dev/null
+++ b/checkpoint-255/tokenizer.json
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6b9e4e7fb171f92fd137b777cc2714bf87d11576700a1dcd7a399e7bbe39537b
+size 17209920
diff --git a/checkpoint-255/tokenizer_config.json b/checkpoint-255/tokenizer_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..ca91a2ef55f4239a7af81d7c9abb05f53621a07b
--- /dev/null
+++ b/checkpoint-255/tokenizer_config.json
@@ -0,0 +1,2064 @@
+{
+ "added_tokens_decoder": {
+ "128000": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128001": {
+ "content": "<|end_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128002": {
+ "content": "<|reserved_special_token_0|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128003": {
+ "content": "<|reserved_special_token_1|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128004": {
+ "content": "<|finetune_right_pad_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128005": {
+ "content": "<|reserved_special_token_2|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128006": {
+ "content": "<|start_header_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128007": {
+ "content": "<|end_header_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128008": {
+ "content": "<|eom_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128009": {
+ "content": "<|eot_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128010": {
+ "content": "<|python_tag|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128011": {
+ "content": "<|reserved_special_token_3|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128012": {
+ "content": "<|reserved_special_token_4|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128013": {
+ "content": "<|reserved_special_token_5|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128014": {
+ "content": "<|reserved_special_token_6|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128015": {
+ "content": "<|reserved_special_token_7|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128016": {
+ "content": "<|reserved_special_token_8|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128017": {
+ "content": "<|reserved_special_token_9|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128018": {
+ "content": "<|reserved_special_token_10|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128019": {
+ "content": "<|reserved_special_token_11|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128020": {
+ "content": "<|reserved_special_token_12|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128021": {
+ "content": "<|reserved_special_token_13|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128022": {
+ "content": "<|reserved_special_token_14|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128023": {
+ "content": "<|reserved_special_token_15|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128024": {
+ "content": "<|reserved_special_token_16|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128025": {
+ "content": "<|reserved_special_token_17|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128026": {
+ "content": "<|reserved_special_token_18|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128027": {
+ "content": "<|reserved_special_token_19|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128028": {
+ "content": "<|reserved_special_token_20|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128029": {
+ "content": "<|reserved_special_token_21|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128030": {
+ "content": "<|reserved_special_token_22|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128031": {
+ "content": "<|reserved_special_token_23|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128032": {
+ "content": "<|reserved_special_token_24|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128033": {
+ "content": "<|reserved_special_token_25|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128034": {
+ "content": "<|reserved_special_token_26|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128035": {
+ "content": "<|reserved_special_token_27|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128036": {
+ "content": "<|reserved_special_token_28|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128037": {
+ "content": "<|reserved_special_token_29|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128038": {
+ "content": "<|reserved_special_token_30|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128039": {
+ "content": "<|reserved_special_token_31|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128040": {
+ "content": "<|reserved_special_token_32|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128041": {
+ "content": "<|reserved_special_token_33|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128042": {
+ "content": "<|reserved_special_token_34|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128043": {
+ "content": "<|reserved_special_token_35|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128044": {
+ "content": "<|reserved_special_token_36|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128045": {
+ "content": "<|reserved_special_token_37|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128046": {
+ "content": "<|reserved_special_token_38|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128047": {
+ "content": "<|reserved_special_token_39|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128048": {
+ "content": "<|reserved_special_token_40|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128049": {
+ "content": "<|reserved_special_token_41|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128050": {
+ "content": "<|reserved_special_token_42|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128051": {
+ "content": "<|reserved_special_token_43|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128052": {
+ "content": "<|reserved_special_token_44|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128053": {
+ "content": "<|reserved_special_token_45|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128054": {
+ "content": "<|reserved_special_token_46|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128055": {
+ "content": "<|reserved_special_token_47|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128056": {
+ "content": "<|reserved_special_token_48|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128057": {
+ "content": "<|reserved_special_token_49|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128058": {
+ "content": "<|reserved_special_token_50|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128059": {
+ "content": "<|reserved_special_token_51|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128060": {
+ "content": "<|reserved_special_token_52|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128061": {
+ "content": "<|reserved_special_token_53|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128062": {
+ "content": "<|reserved_special_token_54|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128063": {
+ "content": "<|reserved_special_token_55|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128064": {
+ "content": "<|reserved_special_token_56|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128065": {
+ "content": "<|reserved_special_token_57|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128066": {
+ "content": "<|reserved_special_token_58|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128067": {
+ "content": "<|reserved_special_token_59|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128068": {
+ "content": "<|reserved_special_token_60|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128069": {
+ "content": "<|reserved_special_token_61|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128070": {
+ "content": "<|reserved_special_token_62|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128071": {
+ "content": "<|reserved_special_token_63|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128072": {
+ "content": "<|reserved_special_token_64|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128073": {
+ "content": "<|reserved_special_token_65|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128074": {
+ "content": "<|reserved_special_token_66|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128075": {
+ "content": "<|reserved_special_token_67|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128076": {
+ "content": "<|reserved_special_token_68|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128077": {
+ "content": "<|reserved_special_token_69|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128078": {
+ "content": "<|reserved_special_token_70|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128079": {
+ "content": "<|reserved_special_token_71|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128080": {
+ "content": "<|reserved_special_token_72|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128081": {
+ "content": "<|reserved_special_token_73|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128082": {
+ "content": "<|reserved_special_token_74|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128083": {
+ "content": "<|reserved_special_token_75|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128084": {
+ "content": "<|reserved_special_token_76|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128085": {
+ "content": "<|reserved_special_token_77|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128086": {
+ "content": "<|reserved_special_token_78|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128087": {
+ "content": "<|reserved_special_token_79|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128088": {
+ "content": "<|reserved_special_token_80|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128089": {
+ "content": "<|reserved_special_token_81|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128090": {
+ "content": "<|reserved_special_token_82|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128091": {
+ "content": "<|reserved_special_token_83|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128092": {
+ "content": "<|reserved_special_token_84|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128093": {
+ "content": "<|reserved_special_token_85|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128094": {
+ "content": "<|reserved_special_token_86|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128095": {
+ "content": "<|reserved_special_token_87|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128096": {
+ "content": "<|reserved_special_token_88|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128097": {
+ "content": "<|reserved_special_token_89|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128098": {
+ "content": "<|reserved_special_token_90|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128099": {
+ "content": "<|reserved_special_token_91|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128100": {
+ "content": "<|reserved_special_token_92|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128101": {
+ "content": "<|reserved_special_token_93|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128102": {
+ "content": "<|reserved_special_token_94|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128103": {
+ "content": "<|reserved_special_token_95|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128104": {
+ "content": "<|reserved_special_token_96|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128105": {
+ "content": "<|reserved_special_token_97|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128106": {
+ "content": "<|reserved_special_token_98|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128107": {
+ "content": "<|reserved_special_token_99|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128108": {
+ "content": "<|reserved_special_token_100|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128109": {
+ "content": "<|reserved_special_token_101|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128110": {
+ "content": "<|reserved_special_token_102|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128111": {
+ "content": "<|reserved_special_token_103|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128112": {
+ "content": "<|reserved_special_token_104|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128113": {
+ "content": "<|reserved_special_token_105|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128114": {
+ "content": "<|reserved_special_token_106|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128115": {
+ "content": "<|reserved_special_token_107|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128116": {
+ "content": "<|reserved_special_token_108|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128117": {
+ "content": "<|reserved_special_token_109|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128118": {
+ "content": "<|reserved_special_token_110|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128119": {
+ "content": "<|reserved_special_token_111|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128120": {
+ "content": "<|reserved_special_token_112|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128121": {
+ "content": "<|reserved_special_token_113|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128122": {
+ "content": "<|reserved_special_token_114|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128123": {
+ "content": "<|reserved_special_token_115|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128124": {
+ "content": "<|reserved_special_token_116|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128125": {
+ "content": "<|reserved_special_token_117|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128126": {
+ "content": "<|reserved_special_token_118|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128127": {
+ "content": "<|reserved_special_token_119|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128128": {
+ "content": "<|reserved_special_token_120|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128129": {
+ "content": "<|reserved_special_token_121|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128130": {
+ "content": "<|reserved_special_token_122|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128131": {
+ "content": "<|reserved_special_token_123|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128132": {
+ "content": "<|reserved_special_token_124|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128133": {
+ "content": "<|reserved_special_token_125|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128134": {
+ "content": "<|reserved_special_token_126|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128135": {
+ "content": "<|reserved_special_token_127|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128136": {
+ "content": "<|reserved_special_token_128|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128137": {
+ "content": "<|reserved_special_token_129|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128138": {
+ "content": "<|reserved_special_token_130|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128139": {
+ "content": "<|reserved_special_token_131|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128140": {
+ "content": "<|reserved_special_token_132|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128141": {
+ "content": "<|reserved_special_token_133|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128142": {
+ "content": "<|reserved_special_token_134|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128143": {
+ "content": "<|reserved_special_token_135|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128144": {
+ "content": "<|reserved_special_token_136|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128145": {
+ "content": "<|reserved_special_token_137|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128146": {
+ "content": "<|reserved_special_token_138|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128147": {
+ "content": "<|reserved_special_token_139|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128148": {
+ "content": "<|reserved_special_token_140|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128149": {
+ "content": "<|reserved_special_token_141|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128150": {
+ "content": "<|reserved_special_token_142|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128151": {
+ "content": "<|reserved_special_token_143|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128152": {
+ "content": "<|reserved_special_token_144|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128153": {
+ "content": "<|reserved_special_token_145|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128154": {
+ "content": "<|reserved_special_token_146|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128155": {
+ "content": "<|reserved_special_token_147|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128156": {
+ "content": "<|reserved_special_token_148|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128157": {
+ "content": "<|reserved_special_token_149|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128158": {
+ "content": "<|reserved_special_token_150|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128159": {
+ "content": "<|reserved_special_token_151|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128160": {
+ "content": "<|reserved_special_token_152|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128161": {
+ "content": "<|reserved_special_token_153|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128162": {
+ "content": "<|reserved_special_token_154|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128163": {
+ "content": "<|reserved_special_token_155|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128164": {
+ "content": "<|reserved_special_token_156|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128165": {
+ "content": "<|reserved_special_token_157|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128166": {
+ "content": "<|reserved_special_token_158|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128167": {
+ "content": "<|reserved_special_token_159|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128168": {
+ "content": "<|reserved_special_token_160|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128169": {
+ "content": "<|reserved_special_token_161|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128170": {
+ "content": "<|reserved_special_token_162|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128171": {
+ "content": "<|reserved_special_token_163|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128172": {
+ "content": "<|reserved_special_token_164|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128173": {
+ "content": "<|reserved_special_token_165|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128174": {
+ "content": "<|reserved_special_token_166|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128175": {
+ "content": "<|reserved_special_token_167|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128176": {
+ "content": "<|reserved_special_token_168|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128177": {
+ "content": "<|reserved_special_token_169|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128178": {
+ "content": "<|reserved_special_token_170|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128179": {
+ "content": "<|reserved_special_token_171|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128180": {
+ "content": "<|reserved_special_token_172|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128181": {
+ "content": "<|reserved_special_token_173|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128182": {
+ "content": "<|reserved_special_token_174|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128183": {
+ "content": "<|reserved_special_token_175|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128184": {
+ "content": "<|reserved_special_token_176|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128185": {
+ "content": "<|reserved_special_token_177|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128186": {
+ "content": "<|reserved_special_token_178|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128187": {
+ "content": "<|reserved_special_token_179|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128188": {
+ "content": "<|reserved_special_token_180|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128189": {
+ "content": "<|reserved_special_token_181|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128190": {
+ "content": "<|reserved_special_token_182|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128191": {
+ "content": "<|reserved_special_token_183|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128192": {
+ "content": "<|reserved_special_token_184|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128193": {
+ "content": "<|reserved_special_token_185|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128194": {
+ "content": "<|reserved_special_token_186|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128195": {
+ "content": "<|reserved_special_token_187|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128196": {
+ "content": "<|reserved_special_token_188|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128197": {
+ "content": "<|reserved_special_token_189|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128198": {
+ "content": "<|reserved_special_token_190|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128199": {
+ "content": "<|reserved_special_token_191|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128200": {
+ "content": "<|reserved_special_token_192|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128201": {
+ "content": "<|reserved_special_token_193|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128202": {
+ "content": "<|reserved_special_token_194|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128203": {
+ "content": "<|reserved_special_token_195|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128204": {
+ "content": "<|reserved_special_token_196|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128205": {
+ "content": "<|reserved_special_token_197|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128206": {
+ "content": "<|reserved_special_token_198|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128207": {
+ "content": "<|reserved_special_token_199|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128208": {
+ "content": "<|reserved_special_token_200|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128209": {
+ "content": "<|reserved_special_token_201|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128210": {
+ "content": "<|reserved_special_token_202|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128211": {
+ "content": "<|reserved_special_token_203|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128212": {
+ "content": "<|reserved_special_token_204|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128213": {
+ "content": "<|reserved_special_token_205|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128214": {
+ "content": "<|reserved_special_token_206|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128215": {
+ "content": "<|reserved_special_token_207|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128216": {
+ "content": "<|reserved_special_token_208|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128217": {
+ "content": "<|reserved_special_token_209|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128218": {
+ "content": "<|reserved_special_token_210|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128219": {
+ "content": "<|reserved_special_token_211|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128220": {
+ "content": "<|reserved_special_token_212|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128221": {
+ "content": "<|reserved_special_token_213|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128222": {
+ "content": "<|reserved_special_token_214|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128223": {
+ "content": "<|reserved_special_token_215|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128224": {
+ "content": "<|reserved_special_token_216|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128225": {
+ "content": "<|reserved_special_token_217|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128226": {
+ "content": "<|reserved_special_token_218|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128227": {
+ "content": "<|reserved_special_token_219|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128228": {
+ "content": "<|reserved_special_token_220|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128229": {
+ "content": "<|reserved_special_token_221|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128230": {
+ "content": "<|reserved_special_token_222|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128231": {
+ "content": "<|reserved_special_token_223|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128232": {
+ "content": "<|reserved_special_token_224|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128233": {
+ "content": "<|reserved_special_token_225|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128234": {
+ "content": "<|reserved_special_token_226|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128235": {
+ "content": "<|reserved_special_token_227|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128236": {
+ "content": "<|reserved_special_token_228|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128237": {
+ "content": "<|reserved_special_token_229|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128238": {
+ "content": "<|reserved_special_token_230|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128239": {
+ "content": "<|reserved_special_token_231|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128240": {
+ "content": "<|reserved_special_token_232|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128241": {
+ "content": "<|reserved_special_token_233|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128242": {
+ "content": "<|reserved_special_token_234|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128243": {
+ "content": "<|reserved_special_token_235|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128244": {
+ "content": "<|reserved_special_token_236|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128245": {
+ "content": "<|reserved_special_token_237|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128246": {
+ "content": "<|reserved_special_token_238|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128247": {
+ "content": "<|reserved_special_token_239|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128248": {
+ "content": "<|reserved_special_token_240|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128249": {
+ "content": "<|reserved_special_token_241|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128250": {
+ "content": "<|reserved_special_token_242|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128251": {
+ "content": "<|reserved_special_token_243|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128252": {
+ "content": "<|reserved_special_token_244|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128253": {
+ "content": "<|reserved_special_token_245|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128254": {
+ "content": "<|reserved_special_token_246|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128255": {
+ "content": "<|reserved_special_token_247|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<|begin_of_text|>",
+ "chat_template": "{{- bos_token }}\n{%- if custom_tools is defined %}\n {%- set tools = custom_tools %}\n{%- endif %}\n{%- if not tools_in_user_message is defined %}\n {%- set tools_in_user_message = true %}\n{%- endif %}\n{%- if not date_string is defined %}\n {%- set date_string = \"26 Jul 2024\" %}\n{%- endif %}\n{%- if not tools is defined %}\n {%- set tools = none %}\n{%- endif %}\n\n{#- This block extracts the system message, so we can slot it into the right place. #}\n{%- if messages[0]['role'] == 'system' %}\n {%- set system_message = messages[0]['content']|trim %}\n {%- set messages = messages[1:] %}\n{%- else %}\n {%- set system_message = \"\" %}\n{%- endif %}\n\n{#- System message + builtin tools #}\n{{- \"<|start_header_id|>system<|end_header_id|>\\n\\n\" }}\n{%- if builtin_tools is defined or tools is not none %}\n {{- \"Environment: ipython\\n\" }}\n{%- endif %}\n{%- if builtin_tools is defined %}\n {{- \"Tools: \" + builtin_tools | reject('equalto', 'code_interpreter') | join(\", \") + \"\\n\\n\"}}\n{%- endif %}\n{{- \"Cutting Knowledge Date: December 2023\\n\" }}\n{{- \"Today Date: \" + date_string + \"\\n\\n\" }}\n{%- if tools is not none and not tools_in_user_message %}\n {{- \"You have access to the following functions. To call a function, please respond with JSON for a function call.\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' }}\n {{- \"Do not use variables.\\n\\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\\n\\n\" }}\n {%- endfor %}\n{%- endif %}\n{{- system_message }}\n{{- \"<|eot_id|>\" }}\n\n{#- Custom tools are passed in a user message with some extra guidance #}\n{%- if tools_in_user_message and not tools is none %}\n {#- Extract the first user message so we can plug it in here #}\n {%- if messages | length != 0 %}\n {%- set first_user_message = messages[0]['content']|trim %}\n {%- set messages = messages[1:] %}\n {%- else %}\n {{- raise_exception(\"Cannot put tools in the first user message when there's no first user message!\") }}\n{%- endif %}\n {{- '<|start_header_id|>user<|end_header_id|>\\n\\n' -}}\n {{- \"Given the following functions, please respond with a JSON for a function call \" }}\n {{- \"with its proper arguments that best answers the given prompt.\\n\\n\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' }}\n {{- \"Do not use variables.\\n\\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\\n\\n\" }}\n {%- endfor %}\n {{- first_user_message + \"<|eot_id|>\"}}\n{%- endif %}\n\n{%- for message in messages %}\n {%- if not (message.role == 'ipython' or message.role == 'tool' or 'tool_calls' in message) %}\n {{- '<|start_header_id|>' + message['role'] + '<|end_header_id|>\\n\\n'+ message['content'] | trim + '<|eot_id|>' }}\n {%- elif 'tool_calls' in message %}\n {%- if not message.tool_calls|length == 1 %}\n {{- raise_exception(\"This model only supports single tool-calls at once!\") }}\n {%- endif %}\n {%- set tool_call = message.tool_calls[0].function %}\n {%- if builtin_tools is defined and tool_call.name in builtin_tools %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' -}}\n {{- \"<|python_tag|>\" + tool_call.name + \".call(\" }}\n {%- for arg_name, arg_val in tool_call.arguments | items %}\n {{- arg_name + '=\"' + arg_val + '\"' }}\n {%- if not loop.last %}\n {{- \", \" }}\n {%- endif %}\n {%- endfor %}\n {{- \")\" }}\n {%- else %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' -}}\n {{- '{\"name\": \"' + tool_call.name + '\", ' }}\n {{- '\"parameters\": ' }}\n {{- tool_call.arguments | tojson }}\n {{- \"}\" }}\n {%- endif %}\n {%- if builtin_tools is defined %}\n {#- This means we're in ipython mode #}\n {{- \"<|eom_id|>\" }}\n {%- else %}\n {{- \"<|eot_id|>\" }}\n {%- endif %}\n {%- elif message.role == \"tool\" or message.role == \"ipython\" %}\n {{- \"<|start_header_id|>ipython<|end_header_id|>\\n\\n\" }}\n {%- if message.content is mapping or message.content is iterable %}\n {{- message.content | tojson }}\n {%- else %}\n {{- message.content }}\n {%- endif %}\n {{- \"<|eot_id|>\" }}\n {%- endif %}\n{%- endfor %}\n{%- if add_generation_prompt %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' }}\n{%- endif %}\n",
+ "clean_up_tokenization_spaces": true,
+ "eos_token": "<|eot_id|>",
+ "extra_special_tokens": {},
+ "model_input_names": [
+ "input_ids",
+ "attention_mask"
+ ],
+ "model_max_length": 131072,
+ "pad_token": "<|end_of_text|>",
+ "tokenizer_class": "PreTrainedTokenizer"
+}
diff --git a/checkpoint-255/trainer_state.json b/checkpoint-255/trainer_state.json
new file mode 100644
index 0000000000000000000000000000000000000000..5d9bb4e0e7fdf24f3f8d77ae1d6a479cb7b16e3f
--- /dev/null
+++ b/checkpoint-255/trainer_state.json
@@ -0,0 +1,1818 @@
+{
+ "best_metric": null,
+ "best_model_checkpoint": null,
+ "epoch": 2.97265625,
+ "eval_steps": 500,
+ "global_step": 255,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.01171875,
+ "grad_norm": 36.23282241821289,
+ "learning_rate": 5.0000000000000004e-08,
+ "loss": 2.3839,
+ "step": 1
+ },
+ {
+ "epoch": 0.0234375,
+ "grad_norm": 35.918636322021484,
+ "learning_rate": 1.0000000000000001e-07,
+ "loss": 2.3798,
+ "step": 2
+ },
+ {
+ "epoch": 0.03515625,
+ "grad_norm": 35.62618637084961,
+ "learning_rate": 1.5000000000000002e-07,
+ "loss": 2.386,
+ "step": 3
+ },
+ {
+ "epoch": 0.046875,
+ "grad_norm": 35.966087341308594,
+ "learning_rate": 2.0000000000000002e-07,
+ "loss": 2.3803,
+ "step": 4
+ },
+ {
+ "epoch": 0.05859375,
+ "grad_norm": 35.38177490234375,
+ "learning_rate": 2.5000000000000004e-07,
+ "loss": 2.3937,
+ "step": 5
+ },
+ {
+ "epoch": 0.0703125,
+ "grad_norm": 35.99677658081055,
+ "learning_rate": 3.0000000000000004e-07,
+ "loss": 2.3906,
+ "step": 6
+ },
+ {
+ "epoch": 0.08203125,
+ "grad_norm": 35.44341278076172,
+ "learning_rate": 3.5000000000000004e-07,
+ "loss": 2.3539,
+ "step": 7
+ },
+ {
+ "epoch": 0.09375,
+ "grad_norm": 35.300697326660156,
+ "learning_rate": 4.0000000000000003e-07,
+ "loss": 2.3459,
+ "step": 8
+ },
+ {
+ "epoch": 0.10546875,
+ "grad_norm": 34.092952728271484,
+ "learning_rate": 4.5000000000000003e-07,
+ "loss": 2.2959,
+ "step": 9
+ },
+ {
+ "epoch": 0.1171875,
+ "grad_norm": 34.46371841430664,
+ "learning_rate": 5.000000000000001e-07,
+ "loss": 2.2661,
+ "step": 10
+ },
+ {
+ "epoch": 0.12890625,
+ "grad_norm": 34.62260818481445,
+ "learning_rate": 5.5e-07,
+ "loss": 2.2918,
+ "step": 11
+ },
+ {
+ "epoch": 0.140625,
+ "grad_norm": 33.790374755859375,
+ "learning_rate": 6.000000000000001e-07,
+ "loss": 2.223,
+ "step": 12
+ },
+ {
+ "epoch": 0.15234375,
+ "grad_norm": 33.766536712646484,
+ "learning_rate": 6.5e-07,
+ "loss": 2.2267,
+ "step": 13
+ },
+ {
+ "epoch": 0.1640625,
+ "grad_norm": 33.894081115722656,
+ "learning_rate": 7.000000000000001e-07,
+ "loss": 2.1465,
+ "step": 14
+ },
+ {
+ "epoch": 0.17578125,
+ "grad_norm": 33.162452697753906,
+ "learning_rate": 7.5e-07,
+ "loss": 2.0495,
+ "step": 15
+ },
+ {
+ "epoch": 0.1875,
+ "grad_norm": 32.954341888427734,
+ "learning_rate": 8.000000000000001e-07,
+ "loss": 1.9627,
+ "step": 16
+ },
+ {
+ "epoch": 0.19921875,
+ "grad_norm": 33.96324157714844,
+ "learning_rate": 8.500000000000001e-07,
+ "loss": 1.8867,
+ "step": 17
+ },
+ {
+ "epoch": 0.2109375,
+ "grad_norm": 33.81139373779297,
+ "learning_rate": 9.000000000000001e-07,
+ "loss": 1.7752,
+ "step": 18
+ },
+ {
+ "epoch": 0.22265625,
+ "grad_norm": 34.87086868286133,
+ "learning_rate": 9.500000000000001e-07,
+ "loss": 1.6944,
+ "step": 19
+ },
+ {
+ "epoch": 0.234375,
+ "grad_norm": 34.84965133666992,
+ "learning_rate": 1.0000000000000002e-06,
+ "loss": 1.5707,
+ "step": 20
+ },
+ {
+ "epoch": 0.24609375,
+ "grad_norm": 35.227317810058594,
+ "learning_rate": 1.0500000000000001e-06,
+ "loss": 1.4369,
+ "step": 21
+ },
+ {
+ "epoch": 0.2578125,
+ "grad_norm": 34.91344451904297,
+ "learning_rate": 1.1e-06,
+ "loss": 1.3202,
+ "step": 22
+ },
+ {
+ "epoch": 0.26953125,
+ "grad_norm": 31.7376766204834,
+ "learning_rate": 1.1500000000000002e-06,
+ "loss": 1.1398,
+ "step": 23
+ },
+ {
+ "epoch": 0.28125,
+ "grad_norm": 30.24741554260254,
+ "learning_rate": 1.2000000000000002e-06,
+ "loss": 1.0421,
+ "step": 24
+ },
+ {
+ "epoch": 0.29296875,
+ "grad_norm": 28.292400360107422,
+ "learning_rate": 1.25e-06,
+ "loss": 0.8817,
+ "step": 25
+ },
+ {
+ "epoch": 0.3046875,
+ "grad_norm": 30.44672393798828,
+ "learning_rate": 1.3e-06,
+ "loss": 0.7073,
+ "step": 26
+ },
+ {
+ "epoch": 0.31640625,
+ "grad_norm": 29.416427612304688,
+ "learning_rate": 1.3500000000000002e-06,
+ "loss": 0.5444,
+ "step": 27
+ },
+ {
+ "epoch": 0.328125,
+ "grad_norm": 24.820096969604492,
+ "learning_rate": 1.4000000000000001e-06,
+ "loss": 0.4025,
+ "step": 28
+ },
+ {
+ "epoch": 0.33984375,
+ "grad_norm": 21.023277282714844,
+ "learning_rate": 1.45e-06,
+ "loss": 0.307,
+ "step": 29
+ },
+ {
+ "epoch": 0.3515625,
+ "grad_norm": 19.656967163085938,
+ "learning_rate": 1.5e-06,
+ "loss": 0.2151,
+ "step": 30
+ },
+ {
+ "epoch": 0.36328125,
+ "grad_norm": 14.91929817199707,
+ "learning_rate": 1.5500000000000002e-06,
+ "loss": 0.1448,
+ "step": 31
+ },
+ {
+ "epoch": 0.375,
+ "grad_norm": 5.083199977874756,
+ "learning_rate": 1.6000000000000001e-06,
+ "loss": 0.09,
+ "step": 32
+ },
+ {
+ "epoch": 0.38671875,
+ "grad_norm": 2.320681571960449,
+ "learning_rate": 1.6500000000000003e-06,
+ "loss": 0.0641,
+ "step": 33
+ },
+ {
+ "epoch": 0.3984375,
+ "grad_norm": 1.6233159303665161,
+ "learning_rate": 1.7000000000000002e-06,
+ "loss": 0.0584,
+ "step": 34
+ },
+ {
+ "epoch": 0.41015625,
+ "grad_norm": 1.6057201623916626,
+ "learning_rate": 1.75e-06,
+ "loss": 0.0626,
+ "step": 35
+ },
+ {
+ "epoch": 0.421875,
+ "grad_norm": 1.8360320329666138,
+ "learning_rate": 1.8000000000000001e-06,
+ "loss": 0.0563,
+ "step": 36
+ },
+ {
+ "epoch": 0.43359375,
+ "grad_norm": 1.736350178718567,
+ "learning_rate": 1.85e-06,
+ "loss": 0.0609,
+ "step": 37
+ },
+ {
+ "epoch": 0.4453125,
+ "grad_norm": 1.1473922729492188,
+ "learning_rate": 1.9000000000000002e-06,
+ "loss": 0.0541,
+ "step": 38
+ },
+ {
+ "epoch": 0.45703125,
+ "grad_norm": 1.1722168922424316,
+ "learning_rate": 1.9500000000000004e-06,
+ "loss": 0.0534,
+ "step": 39
+ },
+ {
+ "epoch": 0.46875,
+ "grad_norm": 1.356987714767456,
+ "learning_rate": 2.0000000000000003e-06,
+ "loss": 0.0496,
+ "step": 40
+ },
+ {
+ "epoch": 0.48046875,
+ "grad_norm": 0.8023216724395752,
+ "learning_rate": 2.05e-06,
+ "loss": 0.0527,
+ "step": 41
+ },
+ {
+ "epoch": 0.4921875,
+ "grad_norm": 0.9803515672683716,
+ "learning_rate": 2.1000000000000002e-06,
+ "loss": 0.0478,
+ "step": 42
+ },
+ {
+ "epoch": 0.50390625,
+ "grad_norm": 0.8733468651771545,
+ "learning_rate": 2.15e-06,
+ "loss": 0.052,
+ "step": 43
+ },
+ {
+ "epoch": 0.515625,
+ "grad_norm": 0.8213743567466736,
+ "learning_rate": 2.2e-06,
+ "loss": 0.0448,
+ "step": 44
+ },
+ {
+ "epoch": 0.52734375,
+ "grad_norm": 0.843189537525177,
+ "learning_rate": 2.25e-06,
+ "loss": 0.0498,
+ "step": 45
+ },
+ {
+ "epoch": 0.5390625,
+ "grad_norm": 0.8801079392433167,
+ "learning_rate": 2.3000000000000004e-06,
+ "loss": 0.0408,
+ "step": 46
+ },
+ {
+ "epoch": 0.55078125,
+ "grad_norm": 0.7131401300430298,
+ "learning_rate": 2.35e-06,
+ "loss": 0.0405,
+ "step": 47
+ },
+ {
+ "epoch": 0.5625,
+ "grad_norm": 0.8996126651763916,
+ "learning_rate": 2.4000000000000003e-06,
+ "loss": 0.0525,
+ "step": 48
+ },
+ {
+ "epoch": 0.57421875,
+ "grad_norm": 0.8606986403465271,
+ "learning_rate": 2.4500000000000003e-06,
+ "loss": 0.0438,
+ "step": 49
+ },
+ {
+ "epoch": 0.5859375,
+ "grad_norm": 0.6918051838874817,
+ "learning_rate": 2.5e-06,
+ "loss": 0.0394,
+ "step": 50
+ },
+ {
+ "epoch": 0.59765625,
+ "grad_norm": 0.6177802085876465,
+ "learning_rate": 2.55e-06,
+ "loss": 0.0387,
+ "step": 51
+ },
+ {
+ "epoch": 0.609375,
+ "grad_norm": 0.7042555809020996,
+ "learning_rate": 2.6e-06,
+ "loss": 0.0434,
+ "step": 52
+ },
+ {
+ "epoch": 0.62109375,
+ "grad_norm": 0.6537717580795288,
+ "learning_rate": 2.6500000000000005e-06,
+ "loss": 0.0396,
+ "step": 53
+ },
+ {
+ "epoch": 0.6328125,
+ "grad_norm": 0.7834082841873169,
+ "learning_rate": 2.7000000000000004e-06,
+ "loss": 0.0411,
+ "step": 54
+ },
+ {
+ "epoch": 0.64453125,
+ "grad_norm": 0.7287272810935974,
+ "learning_rate": 2.7500000000000004e-06,
+ "loss": 0.0408,
+ "step": 55
+ },
+ {
+ "epoch": 0.65625,
+ "grad_norm": 0.7186263203620911,
+ "learning_rate": 2.8000000000000003e-06,
+ "loss": 0.0394,
+ "step": 56
+ },
+ {
+ "epoch": 0.66796875,
+ "grad_norm": 0.7264899611473083,
+ "learning_rate": 2.85e-06,
+ "loss": 0.0427,
+ "step": 57
+ },
+ {
+ "epoch": 0.6796875,
+ "grad_norm": 0.7665618062019348,
+ "learning_rate": 2.9e-06,
+ "loss": 0.0368,
+ "step": 58
+ },
+ {
+ "epoch": 0.69140625,
+ "grad_norm": 0.7222962379455566,
+ "learning_rate": 2.95e-06,
+ "loss": 0.0412,
+ "step": 59
+ },
+ {
+ "epoch": 0.703125,
+ "grad_norm": 0.7061101794242859,
+ "learning_rate": 3e-06,
+ "loss": 0.0377,
+ "step": 60
+ },
+ {
+ "epoch": 0.71484375,
+ "grad_norm": 0.5724324584007263,
+ "learning_rate": 3.05e-06,
+ "loss": 0.0387,
+ "step": 61
+ },
+ {
+ "epoch": 0.7265625,
+ "grad_norm": 0.5535506010055542,
+ "learning_rate": 3.1000000000000004e-06,
+ "loss": 0.0403,
+ "step": 62
+ },
+ {
+ "epoch": 0.73828125,
+ "grad_norm": 0.6553678512573242,
+ "learning_rate": 3.1500000000000003e-06,
+ "loss": 0.0415,
+ "step": 63
+ },
+ {
+ "epoch": 0.75,
+ "grad_norm": 0.6137285828590393,
+ "learning_rate": 3.2000000000000003e-06,
+ "loss": 0.0383,
+ "step": 64
+ },
+ {
+ "epoch": 0.76171875,
+ "grad_norm": 0.5985754132270813,
+ "learning_rate": 3.2500000000000002e-06,
+ "loss": 0.0355,
+ "step": 65
+ },
+ {
+ "epoch": 0.7734375,
+ "grad_norm": 0.5903909802436829,
+ "learning_rate": 3.3000000000000006e-06,
+ "loss": 0.0374,
+ "step": 66
+ },
+ {
+ "epoch": 0.78515625,
+ "grad_norm": 0.5718765258789062,
+ "learning_rate": 3.3500000000000005e-06,
+ "loss": 0.0339,
+ "step": 67
+ },
+ {
+ "epoch": 0.796875,
+ "grad_norm": 0.6844965815544128,
+ "learning_rate": 3.4000000000000005e-06,
+ "loss": 0.0405,
+ "step": 68
+ },
+ {
+ "epoch": 0.80859375,
+ "grad_norm": 0.5959618091583252,
+ "learning_rate": 3.45e-06,
+ "loss": 0.0338,
+ "step": 69
+ },
+ {
+ "epoch": 0.8203125,
+ "grad_norm": 0.6095123291015625,
+ "learning_rate": 3.5e-06,
+ "loss": 0.0362,
+ "step": 70
+ },
+ {
+ "epoch": 0.83203125,
+ "grad_norm": 0.543708086013794,
+ "learning_rate": 3.5500000000000003e-06,
+ "loss": 0.0355,
+ "step": 71
+ },
+ {
+ "epoch": 0.84375,
+ "grad_norm": 0.6969983577728271,
+ "learning_rate": 3.6000000000000003e-06,
+ "loss": 0.0325,
+ "step": 72
+ },
+ {
+ "epoch": 0.85546875,
+ "grad_norm": 0.6022969484329224,
+ "learning_rate": 3.65e-06,
+ "loss": 0.0342,
+ "step": 73
+ },
+ {
+ "epoch": 0.8671875,
+ "grad_norm": 0.6262147426605225,
+ "learning_rate": 3.7e-06,
+ "loss": 0.0348,
+ "step": 74
+ },
+ {
+ "epoch": 0.87890625,
+ "grad_norm": 0.5729933381080627,
+ "learning_rate": 3.7500000000000005e-06,
+ "loss": 0.0318,
+ "step": 75
+ },
+ {
+ "epoch": 0.890625,
+ "grad_norm": 0.5846775770187378,
+ "learning_rate": 3.8000000000000005e-06,
+ "loss": 0.0309,
+ "step": 76
+ },
+ {
+ "epoch": 0.90234375,
+ "grad_norm": 0.6469219923019409,
+ "learning_rate": 3.85e-06,
+ "loss": 0.0324,
+ "step": 77
+ },
+ {
+ "epoch": 0.9140625,
+ "grad_norm": 0.6574859023094177,
+ "learning_rate": 3.900000000000001e-06,
+ "loss": 0.0325,
+ "step": 78
+ },
+ {
+ "epoch": 0.92578125,
+ "grad_norm": 0.5833832025527954,
+ "learning_rate": 3.95e-06,
+ "loss": 0.0232,
+ "step": 79
+ },
+ {
+ "epoch": 0.9375,
+ "grad_norm": 0.7503570318222046,
+ "learning_rate": 4.000000000000001e-06,
+ "loss": 0.0267,
+ "step": 80
+ },
+ {
+ "epoch": 0.94921875,
+ "grad_norm": 0.7181633114814758,
+ "learning_rate": 4.05e-06,
+ "loss": 0.0304,
+ "step": 81
+ },
+ {
+ "epoch": 0.9609375,
+ "grad_norm": 0.6477274298667908,
+ "learning_rate": 4.1e-06,
+ "loss": 0.0297,
+ "step": 82
+ },
+ {
+ "epoch": 0.97265625,
+ "grad_norm": 0.6768563389778137,
+ "learning_rate": 4.15e-06,
+ "loss": 0.0279,
+ "step": 83
+ },
+ {
+ "epoch": 0.984375,
+ "grad_norm": 0.7905837297439575,
+ "learning_rate": 4.2000000000000004e-06,
+ "loss": 0.0301,
+ "step": 84
+ },
+ {
+ "epoch": 0.99609375,
+ "grad_norm": 0.5576608777046204,
+ "learning_rate": 4.25e-06,
+ "loss": 0.0322,
+ "step": 85
+ },
+ {
+ "epoch": 1.0,
+ "grad_norm": 0.5576608777046204,
+ "learning_rate": 4.3e-06,
+ "loss": 0.0226,
+ "step": 86
+ },
+ {
+ "epoch": 1.01171875,
+ "grad_norm": 1.0774812698364258,
+ "learning_rate": 4.350000000000001e-06,
+ "loss": 0.0215,
+ "step": 87
+ },
+ {
+ "epoch": 1.0234375,
+ "grad_norm": 0.47373324632644653,
+ "learning_rate": 4.4e-06,
+ "loss": 0.0235,
+ "step": 88
+ },
+ {
+ "epoch": 1.03515625,
+ "grad_norm": 0.7665970325469971,
+ "learning_rate": 4.450000000000001e-06,
+ "loss": 0.0242,
+ "step": 89
+ },
+ {
+ "epoch": 1.046875,
+ "grad_norm": 0.6290147304534912,
+ "learning_rate": 4.5e-06,
+ "loss": 0.0209,
+ "step": 90
+ },
+ {
+ "epoch": 1.05859375,
+ "grad_norm": 0.5703024864196777,
+ "learning_rate": 4.5500000000000005e-06,
+ "loss": 0.0192,
+ "step": 91
+ },
+ {
+ "epoch": 1.0703125,
+ "grad_norm": 0.6099259853363037,
+ "learning_rate": 4.600000000000001e-06,
+ "loss": 0.0181,
+ "step": 92
+ },
+ {
+ "epoch": 1.08203125,
+ "grad_norm": 0.6570988297462463,
+ "learning_rate": 4.65e-06,
+ "loss": 0.0201,
+ "step": 93
+ },
+ {
+ "epoch": 1.09375,
+ "grad_norm": 0.7848325371742249,
+ "learning_rate": 4.7e-06,
+ "loss": 0.0253,
+ "step": 94
+ },
+ {
+ "epoch": 1.10546875,
+ "grad_norm": 0.6759209036827087,
+ "learning_rate": 4.75e-06,
+ "loss": 0.0195,
+ "step": 95
+ },
+ {
+ "epoch": 1.1171875,
+ "grad_norm": 0.4861151874065399,
+ "learning_rate": 4.800000000000001e-06,
+ "loss": 0.0191,
+ "step": 96
+ },
+ {
+ "epoch": 1.12890625,
+ "grad_norm": 0.6268576383590698,
+ "learning_rate": 4.85e-06,
+ "loss": 0.0211,
+ "step": 97
+ },
+ {
+ "epoch": 1.140625,
+ "grad_norm": 0.5862017869949341,
+ "learning_rate": 4.9000000000000005e-06,
+ "loss": 0.0177,
+ "step": 98
+ },
+ {
+ "epoch": 1.15234375,
+ "grad_norm": 0.4569724202156067,
+ "learning_rate": 4.95e-06,
+ "loss": 0.0164,
+ "step": 99
+ },
+ {
+ "epoch": 1.1640625,
+ "grad_norm": 0.4539048969745636,
+ "learning_rate": 5e-06,
+ "loss": 0.0152,
+ "step": 100
+ },
+ {
+ "epoch": 1.17578125,
+ "grad_norm": 0.4553528428077698,
+ "learning_rate": 4.999926609487568e-06,
+ "loss": 0.0208,
+ "step": 101
+ },
+ {
+ "epoch": 1.1875,
+ "grad_norm": 0.5182592272758484,
+ "learning_rate": 4.999706442259205e-06,
+ "loss": 0.0154,
+ "step": 102
+ },
+ {
+ "epoch": 1.19921875,
+ "grad_norm": 0.5602673888206482,
+ "learning_rate": 4.999339511241458e-06,
+ "loss": 0.0196,
+ "step": 103
+ },
+ {
+ "epoch": 1.2109375,
+ "grad_norm": 0.7579494118690491,
+ "learning_rate": 4.9988258379777334e-06,
+ "loss": 0.0198,
+ "step": 104
+ },
+ {
+ "epoch": 1.22265625,
+ "grad_norm": 0.603757381439209,
+ "learning_rate": 4.998165452627025e-06,
+ "loss": 0.0185,
+ "step": 105
+ },
+ {
+ "epoch": 1.234375,
+ "grad_norm": 0.5520291924476624,
+ "learning_rate": 4.99735839396215e-06,
+ "loss": 0.018,
+ "step": 106
+ },
+ {
+ "epoch": 1.24609375,
+ "grad_norm": 0.55808424949646,
+ "learning_rate": 4.996404709367466e-06,
+ "loss": 0.0159,
+ "step": 107
+ },
+ {
+ "epoch": 1.2578125,
+ "grad_norm": 0.47174298763275146,
+ "learning_rate": 4.995304454836095e-06,
+ "loss": 0.0122,
+ "step": 108
+ },
+ {
+ "epoch": 1.26953125,
+ "grad_norm": 0.5289337038993835,
+ "learning_rate": 4.994057694966632e-06,
+ "loss": 0.0168,
+ "step": 109
+ },
+ {
+ "epoch": 1.28125,
+ "grad_norm": 0.5390430092811584,
+ "learning_rate": 4.992664502959351e-06,
+ "loss": 0.017,
+ "step": 110
+ },
+ {
+ "epoch": 1.29296875,
+ "grad_norm": 0.4966451823711395,
+ "learning_rate": 4.991124960611916e-06,
+ "loss": 0.0145,
+ "step": 111
+ },
+ {
+ "epoch": 1.3046875,
+ "grad_norm": 0.6148604154586792,
+ "learning_rate": 4.989439158314566e-06,
+ "loss": 0.0139,
+ "step": 112
+ },
+ {
+ "epoch": 1.31640625,
+ "grad_norm": 0.6303534507751465,
+ "learning_rate": 4.9876071950448185e-06,
+ "loss": 0.0118,
+ "step": 113
+ },
+ {
+ "epoch": 1.328125,
+ "grad_norm": 0.5410207509994507,
+ "learning_rate": 4.98562917836165e-06,
+ "loss": 0.0094,
+ "step": 114
+ },
+ {
+ "epoch": 1.33984375,
+ "grad_norm": 0.5350080132484436,
+ "learning_rate": 4.983505224399188e-06,
+ "loss": 0.0158,
+ "step": 115
+ },
+ {
+ "epoch": 1.3515625,
+ "grad_norm": 1.017317295074463,
+ "learning_rate": 4.9812354578598876e-06,
+ "loss": 0.0201,
+ "step": 116
+ },
+ {
+ "epoch": 1.36328125,
+ "grad_norm": 0.6891007423400879,
+ "learning_rate": 4.978820012007213e-06,
+ "loss": 0.0127,
+ "step": 117
+ },
+ {
+ "epoch": 1.375,
+ "grad_norm": 0.4756389260292053,
+ "learning_rate": 4.976259028657812e-06,
+ "loss": 0.0188,
+ "step": 118
+ },
+ {
+ "epoch": 1.38671875,
+ "grad_norm": 0.5957350730895996,
+ "learning_rate": 4.973552658173186e-06,
+ "loss": 0.011,
+ "step": 119
+ },
+ {
+ "epoch": 1.3984375,
+ "grad_norm": 0.5012223720550537,
+ "learning_rate": 4.970701059450872e-06,
+ "loss": 0.0138,
+ "step": 120
+ },
+ {
+ "epoch": 1.41015625,
+ "grad_norm": 0.4408419132232666,
+ "learning_rate": 4.9677043999151e-06,
+ "loss": 0.0144,
+ "step": 121
+ },
+ {
+ "epoch": 1.421875,
+ "grad_norm": 0.5721736550331116,
+ "learning_rate": 4.964562855506976e-06,
+ "loss": 0.0135,
+ "step": 122
+ },
+ {
+ "epoch": 1.43359375,
+ "grad_norm": 0.5479208827018738,
+ "learning_rate": 4.961276610674141e-06,
+ "loss": 0.0128,
+ "step": 123
+ },
+ {
+ "epoch": 1.4453125,
+ "grad_norm": 1.0117675065994263,
+ "learning_rate": 4.9578458583599495e-06,
+ "loss": 0.0111,
+ "step": 124
+ },
+ {
+ "epoch": 1.45703125,
+ "grad_norm": 0.5504026412963867,
+ "learning_rate": 4.954270799992138e-06,
+ "loss": 0.0083,
+ "step": 125
+ },
+ {
+ "epoch": 1.46875,
+ "grad_norm": 0.48403099179267883,
+ "learning_rate": 4.950551645470998e-06,
+ "loss": 0.0083,
+ "step": 126
+ },
+ {
+ "epoch": 1.48046875,
+ "grad_norm": 0.6866800785064697,
+ "learning_rate": 4.9466886131570565e-06,
+ "loss": 0.0085,
+ "step": 127
+ },
+ {
+ "epoch": 1.4921875,
+ "grad_norm": 0.872557520866394,
+ "learning_rate": 4.942681929858249e-06,
+ "loss": 0.0102,
+ "step": 128
+ },
+ {
+ "epoch": 1.50390625,
+ "grad_norm": 0.6924716234207153,
+ "learning_rate": 4.9385318308166065e-06,
+ "loss": 0.012,
+ "step": 129
+ },
+ {
+ "epoch": 1.515625,
+ "grad_norm": 0.5060118436813354,
+ "learning_rate": 4.934238559694448e-06,
+ "loss": 0.0084,
+ "step": 130
+ },
+ {
+ "epoch": 1.52734375,
+ "grad_norm": 0.6256171464920044,
+ "learning_rate": 4.929802368560066e-06,
+ "loss": 0.0081,
+ "step": 131
+ },
+ {
+ "epoch": 1.5390625,
+ "grad_norm": 0.5422537922859192,
+ "learning_rate": 4.925223517872934e-06,
+ "loss": 0.0077,
+ "step": 132
+ },
+ {
+ "epoch": 1.55078125,
+ "grad_norm": 0.953416109085083,
+ "learning_rate": 4.920502276468408e-06,
+ "loss": 0.0078,
+ "step": 133
+ },
+ {
+ "epoch": 1.5625,
+ "grad_norm": 0.4540804624557495,
+ "learning_rate": 4.915638921541952e-06,
+ "loss": 0.0097,
+ "step": 134
+ },
+ {
+ "epoch": 1.57421875,
+ "grad_norm": 0.3773641884326935,
+ "learning_rate": 4.9106337386328524e-06,
+ "loss": 0.0098,
+ "step": 135
+ },
+ {
+ "epoch": 1.5859375,
+ "grad_norm": 0.7970175743103027,
+ "learning_rate": 4.905487021607462e-06,
+ "loss": 0.0056,
+ "step": 136
+ },
+ {
+ "epoch": 1.59765625,
+ "grad_norm": 0.45197635889053345,
+ "learning_rate": 4.900199072641937e-06,
+ "loss": 0.0078,
+ "step": 137
+ },
+ {
+ "epoch": 1.609375,
+ "grad_norm": 0.38231438398361206,
+ "learning_rate": 4.894770202204509e-06,
+ "loss": 0.0072,
+ "step": 138
+ },
+ {
+ "epoch": 1.62109375,
+ "grad_norm": 0.2945426404476166,
+ "learning_rate": 4.889200729037241e-06,
+ "loss": 0.0086,
+ "step": 139
+ },
+ {
+ "epoch": 1.6328125,
+ "grad_norm": 0.49699363112449646,
+ "learning_rate": 4.883490980137327e-06,
+ "loss": 0.0073,
+ "step": 140
+ },
+ {
+ "epoch": 1.64453125,
+ "grad_norm": 0.38112956285476685,
+ "learning_rate": 4.8776412907378845e-06,
+ "loss": 0.0056,
+ "step": 141
+ },
+ {
+ "epoch": 1.65625,
+ "grad_norm": 0.46780407428741455,
+ "learning_rate": 4.871652004288275e-06,
+ "loss": 0.0078,
+ "step": 142
+ },
+ {
+ "epoch": 1.66796875,
+ "grad_norm": 0.43764325976371765,
+ "learning_rate": 4.865523472433942e-06,
+ "loss": 0.005,
+ "step": 143
+ },
+ {
+ "epoch": 1.6796875,
+ "grad_norm": 0.3445664644241333,
+ "learning_rate": 4.859256054995758e-06,
+ "loss": 0.0069,
+ "step": 144
+ },
+ {
+ "epoch": 1.69140625,
+ "grad_norm": 0.40410447120666504,
+ "learning_rate": 4.8528501199489045e-06,
+ "loss": 0.0088,
+ "step": 145
+ },
+ {
+ "epoch": 1.703125,
+ "grad_norm": 0.5876736640930176,
+ "learning_rate": 4.846306043401268e-06,
+ "loss": 0.0057,
+ "step": 146
+ },
+ {
+ "epoch": 1.71484375,
+ "grad_norm": 0.5149250626564026,
+ "learning_rate": 4.839624209571352e-06,
+ "loss": 0.0056,
+ "step": 147
+ },
+ {
+ "epoch": 1.7265625,
+ "grad_norm": 0.7009180784225464,
+ "learning_rate": 4.832805010765724e-06,
+ "loss": 0.0088,
+ "step": 148
+ },
+ {
+ "epoch": 1.73828125,
+ "grad_norm": 0.42258334159851074,
+ "learning_rate": 4.8258488473559794e-06,
+ "loss": 0.004,
+ "step": 149
+ },
+ {
+ "epoch": 1.75,
+ "grad_norm": 0.39231887459754944,
+ "learning_rate": 4.8187561277552376e-06,
+ "loss": 0.005,
+ "step": 150
+ },
+ {
+ "epoch": 1.76171875,
+ "grad_norm": 0.3317432701587677,
+ "learning_rate": 4.811527268394157e-06,
+ "loss": 0.0038,
+ "step": 151
+ },
+ {
+ "epoch": 1.7734375,
+ "grad_norm": 0.5022267699241638,
+ "learning_rate": 4.804162693696494e-06,
+ "loss": 0.0056,
+ "step": 152
+ },
+ {
+ "epoch": 1.78515625,
+ "grad_norm": 0.39019322395324707,
+ "learning_rate": 4.796662836054176e-06,
+ "loss": 0.0053,
+ "step": 153
+ },
+ {
+ "epoch": 1.796875,
+ "grad_norm": 0.5674042701721191,
+ "learning_rate": 4.789028135801919e-06,
+ "loss": 0.007,
+ "step": 154
+ },
+ {
+ "epoch": 1.80859375,
+ "grad_norm": 0.5690024495124817,
+ "learning_rate": 4.7812590411913755e-06,
+ "loss": 0.0053,
+ "step": 155
+ },
+ {
+ "epoch": 1.8203125,
+ "grad_norm": 0.23775412142276764,
+ "learning_rate": 4.773356008364812e-06,
+ "loss": 0.0031,
+ "step": 156
+ },
+ {
+ "epoch": 1.83203125,
+ "grad_norm": 0.4698558747768402,
+ "learning_rate": 4.765319501328332e-06,
+ "loss": 0.0021,
+ "step": 157
+ },
+ {
+ "epoch": 1.84375,
+ "grad_norm": 0.21603639423847198,
+ "learning_rate": 4.757149991924633e-06,
+ "loss": 0.0046,
+ "step": 158
+ },
+ {
+ "epoch": 1.85546875,
+ "grad_norm": 0.33830726146698,
+ "learning_rate": 4.748847959805297e-06,
+ "loss": 0.0022,
+ "step": 159
+ },
+ {
+ "epoch": 1.8671875,
+ "grad_norm": 0.44919782876968384,
+ "learning_rate": 4.740413892402639e-06,
+ "loss": 0.0032,
+ "step": 160
+ },
+ {
+ "epoch": 1.87890625,
+ "grad_norm": 0.5119614601135254,
+ "learning_rate": 4.731848284901082e-06,
+ "loss": 0.006,
+ "step": 161
+ },
+ {
+ "epoch": 1.890625,
+ "grad_norm": 0.3875437080860138,
+ "learning_rate": 4.723151640208084e-06,
+ "loss": 0.0024,
+ "step": 162
+ },
+ {
+ "epoch": 1.90234375,
+ "grad_norm": 0.3179910182952881,
+ "learning_rate": 4.714324468924614e-06,
+ "loss": 0.0037,
+ "step": 163
+ },
+ {
+ "epoch": 1.9140625,
+ "grad_norm": 0.43395644426345825,
+ "learning_rate": 4.705367289315172e-06,
+ "loss": 0.0027,
+ "step": 164
+ },
+ {
+ "epoch": 1.92578125,
+ "grad_norm": 0.3703945577144623,
+ "learning_rate": 4.696280627277356e-06,
+ "loss": 0.0047,
+ "step": 165
+ },
+ {
+ "epoch": 1.9375,
+ "grad_norm": 0.2503529191017151,
+ "learning_rate": 4.687065016310996e-06,
+ "loss": 0.0052,
+ "step": 166
+ },
+ {
+ "epoch": 1.94921875,
+ "grad_norm": 0.3613075315952301,
+ "learning_rate": 4.6777209974868194e-06,
+ "loss": 0.0034,
+ "step": 167
+ },
+ {
+ "epoch": 1.9609375,
+ "grad_norm": 0.3578515350818634,
+ "learning_rate": 4.668249119414692e-06,
+ "loss": 0.0021,
+ "step": 168
+ },
+ {
+ "epoch": 1.97265625,
+ "grad_norm": 0.1784515529870987,
+ "learning_rate": 4.6586499382113985e-06,
+ "loss": 0.0018,
+ "step": 169
+ },
+ {
+ "epoch": 1.984375,
+ "grad_norm": 0.259198397397995,
+ "learning_rate": 4.648924017468003e-06,
+ "loss": 0.0009,
+ "step": 170
+ },
+ {
+ "epoch": 1.99609375,
+ "grad_norm": 0.7194133400917053,
+ "learning_rate": 4.6390719282167515e-06,
+ "loss": 0.0041,
+ "step": 171
+ },
+ {
+ "epoch": 2.0,
+ "grad_norm": 0.7194133400917053,
+ "learning_rate": 4.629094248897546e-06,
+ "loss": 0.0014,
+ "step": 172
+ },
+ {
+ "epoch": 2.01171875,
+ "grad_norm": 0.5032601952552795,
+ "learning_rate": 4.618991565323987e-06,
+ "loss": 0.0028,
+ "step": 173
+ },
+ {
+ "epoch": 2.0234375,
+ "grad_norm": 0.6387512683868408,
+ "learning_rate": 4.608764470648971e-06,
+ "loss": 0.0007,
+ "step": 174
+ },
+ {
+ "epoch": 2.03515625,
+ "grad_norm": 0.23177844285964966,
+ "learning_rate": 4.598413565329876e-06,
+ "loss": 0.0006,
+ "step": 175
+ },
+ {
+ "epoch": 2.046875,
+ "grad_norm": 0.1713147759437561,
+ "learning_rate": 4.587939457093296e-06,
+ "loss": 0.0003,
+ "step": 176
+ },
+ {
+ "epoch": 2.05859375,
+ "grad_norm": 0.06128697097301483,
+ "learning_rate": 4.577342760899368e-06,
+ "loss": 0.0001,
+ "step": 177
+ },
+ {
+ "epoch": 2.0703125,
+ "grad_norm": 0.538530170917511,
+ "learning_rate": 4.566624098905665e-06,
+ "loss": 0.0004,
+ "step": 178
+ },
+ {
+ "epoch": 2.08203125,
+ "grad_norm": 0.03301696106791496,
+ "learning_rate": 4.555784100430662e-06,
+ "loss": 0.0004,
+ "step": 179
+ },
+ {
+ "epoch": 2.09375,
+ "grad_norm": 0.21366432309150696,
+ "learning_rate": 4.544823401916794e-06,
+ "loss": 0.0014,
+ "step": 180
+ },
+ {
+ "epoch": 2.10546875,
+ "grad_norm": 0.13440090417861938,
+ "learning_rate": 4.533742646893086e-06,
+ "loss": 0.0004,
+ "step": 181
+ },
+ {
+ "epoch": 2.1171875,
+ "grad_norm": 0.531997799873352,
+ "learning_rate": 4.522542485937369e-06,
+ "loss": 0.0008,
+ "step": 182
+ },
+ {
+ "epoch": 2.12890625,
+ "grad_norm": 0.2832719385623932,
+ "learning_rate": 4.511223576638084e-06,
+ "loss": 0.0023,
+ "step": 183
+ },
+ {
+ "epoch": 2.140625,
+ "grad_norm": 0.3814002275466919,
+ "learning_rate": 4.499786583555675e-06,
+ "loss": 0.001,
+ "step": 184
+ },
+ {
+ "epoch": 2.15234375,
+ "grad_norm": 0.2522885501384735,
+ "learning_rate": 4.4882321781835666e-06,
+ "loss": 0.0004,
+ "step": 185
+ },
+ {
+ "epoch": 2.1640625,
+ "grad_norm": 0.3866797983646393,
+ "learning_rate": 4.476561038908745e-06,
+ "loss": 0.0007,
+ "step": 186
+ },
+ {
+ "epoch": 2.17578125,
+ "grad_norm": 0.2128417044878006,
+ "learning_rate": 4.464773850971924e-06,
+ "loss": 0.0001,
+ "step": 187
+ },
+ {
+ "epoch": 2.1875,
+ "grad_norm": 0.135880708694458,
+ "learning_rate": 4.452871306427314e-06,
+ "loss": 0.0031,
+ "step": 188
+ },
+ {
+ "epoch": 2.19921875,
+ "grad_norm": 0.38835451006889343,
+ "learning_rate": 4.440854104101988e-06,
+ "loss": 0.0015,
+ "step": 189
+ },
+ {
+ "epoch": 2.2109375,
+ "grad_norm": 0.18233123421669006,
+ "learning_rate": 4.428722949554858e-06,
+ "loss": 0.0001,
+ "step": 190
+ },
+ {
+ "epoch": 2.22265625,
+ "grad_norm": 0.10753051191568375,
+ "learning_rate": 4.416478555035241e-06,
+ "loss": 0.0017,
+ "step": 191
+ },
+ {
+ "epoch": 2.234375,
+ "grad_norm": 0.30138343572616577,
+ "learning_rate": 4.404121639441047e-06,
+ "loss": 0.0004,
+ "step": 192
+ },
+ {
+ "epoch": 2.24609375,
+ "grad_norm": 0.12771356105804443,
+ "learning_rate": 4.391652928276572e-06,
+ "loss": 0.0022,
+ "step": 193
+ },
+ {
+ "epoch": 2.2578125,
+ "grad_norm": 0.4173564612865448,
+ "learning_rate": 4.379073153609896e-06,
+ "loss": 0.0001,
+ "step": 194
+ },
+ {
+ "epoch": 2.26953125,
+ "grad_norm": 0.08329658955335617,
+ "learning_rate": 4.366383054029907e-06,
+ "loss": 0.0009,
+ "step": 195
+ },
+ {
+ "epoch": 2.28125,
+ "grad_norm": 0.21187439560890198,
+ "learning_rate": 4.3535833746029335e-06,
+ "loss": 0.0013,
+ "step": 196
+ },
+ {
+ "epoch": 2.29296875,
+ "grad_norm": 0.046030864119529724,
+ "learning_rate": 4.340674866829001e-06,
+ "loss": 0.0004,
+ "step": 197
+ },
+ {
+ "epoch": 2.3046875,
+ "grad_norm": 0.08373020589351654,
+ "learning_rate": 4.32765828859771e-06,
+ "loss": 0.0014,
+ "step": 198
+ },
+ {
+ "epoch": 2.31640625,
+ "grad_norm": 0.4026390314102173,
+ "learning_rate": 4.314534404143738e-06,
+ "loss": 0.0003,
+ "step": 199
+ },
+ {
+ "epoch": 2.328125,
+ "grad_norm": 0.24255593121051788,
+ "learning_rate": 4.3013039840019675e-06,
+ "loss": 0.0009,
+ "step": 200
+ },
+ {
+ "epoch": 2.33984375,
+ "grad_norm": 0.2282780110836029,
+ "learning_rate": 4.287967804962252e-06,
+ "loss": 0.0025,
+ "step": 201
+ },
+ {
+ "epoch": 2.3515625,
+ "grad_norm": 0.14743350446224213,
+ "learning_rate": 4.274526650023801e-06,
+ "loss": 0.0014,
+ "step": 202
+ },
+ {
+ "epoch": 2.36328125,
+ "grad_norm": 0.17971713840961456,
+ "learning_rate": 4.260981308349214e-06,
+ "loss": 0.0003,
+ "step": 203
+ },
+ {
+ "epoch": 2.375,
+ "grad_norm": 0.03872796148061752,
+ "learning_rate": 4.247332575218144e-06,
+ "loss": 0.0003,
+ "step": 204
+ },
+ {
+ "epoch": 2.38671875,
+ "grad_norm": 0.06636863946914673,
+ "learning_rate": 4.233581251980604e-06,
+ "loss": 0.0004,
+ "step": 205
+ },
+ {
+ "epoch": 2.3984375,
+ "grad_norm": 0.1254304051399231,
+ "learning_rate": 4.2197281460099245e-06,
+ "loss": 0.0002,
+ "step": 206
+ },
+ {
+ "epoch": 2.41015625,
+ "grad_norm": 0.03998701646924019,
+ "learning_rate": 4.2057740706553415e-06,
+ "loss": 0.0007,
+ "step": 207
+ },
+ {
+ "epoch": 2.421875,
+ "grad_norm": 0.8734745979309082,
+ "learning_rate": 4.191719845194246e-06,
+ "loss": 0.0019,
+ "step": 208
+ },
+ {
+ "epoch": 2.43359375,
+ "grad_norm": 0.34975236654281616,
+ "learning_rate": 4.177566294784085e-06,
+ "loss": 0.0006,
+ "step": 209
+ },
+ {
+ "epoch": 2.4453125,
+ "grad_norm": 0.07566183060407639,
+ "learning_rate": 4.163314250413913e-06,
+ "loss": 0.0003,
+ "step": 210
+ },
+ {
+ "epoch": 2.45703125,
+ "grad_norm": 0.09056711941957474,
+ "learning_rate": 4.148964548855603e-06,
+ "loss": 0.0002,
+ "step": 211
+ },
+ {
+ "epoch": 2.46875,
+ "grad_norm": 0.16160684823989868,
+ "learning_rate": 4.134518032614713e-06,
+ "loss": 0.0009,
+ "step": 212
+ },
+ {
+ "epoch": 2.48046875,
+ "grad_norm": 0.0812753438949585,
+ "learning_rate": 4.119975549881029e-06,
+ "loss": 0.0002,
+ "step": 213
+ },
+ {
+ "epoch": 2.4921875,
+ "grad_norm": 0.05827738344669342,
+ "learning_rate": 4.105337954478756e-06,
+ "loss": 0.0007,
+ "step": 214
+ },
+ {
+ "epoch": 2.50390625,
+ "grad_norm": 0.2625848054885864,
+ "learning_rate": 4.0906061058164e-06,
+ "loss": 0.0003,
+ "step": 215
+ },
+ {
+ "epoch": 2.515625,
+ "grad_norm": 0.1771923154592514,
+ "learning_rate": 4.075780868836296e-06,
+ "loss": 0.0005,
+ "step": 216
+ },
+ {
+ "epoch": 2.52734375,
+ "grad_norm": 0.034166041761636734,
+ "learning_rate": 4.060863113963835e-06,
+ "loss": 0.0012,
+ "step": 217
+ },
+ {
+ "epoch": 2.5390625,
+ "grad_norm": 0.14099521934986115,
+ "learning_rate": 4.045853717056358e-06,
+ "loss": 0.0,
+ "step": 218
+ },
+ {
+ "epoch": 2.55078125,
+ "grad_norm": 0.34704917669296265,
+ "learning_rate": 4.030753559351728e-06,
+ "loss": 0.0006,
+ "step": 219
+ },
+ {
+ "epoch": 2.5625,
+ "grad_norm": 0.25681111216545105,
+ "learning_rate": 4.015563527416596e-06,
+ "loss": 0.0004,
+ "step": 220
+ },
+ {
+ "epoch": 2.57421875,
+ "grad_norm": 0.36212408542633057,
+ "learning_rate": 4.000284513094342e-06,
+ "loss": 0.0003,
+ "step": 221
+ },
+ {
+ "epoch": 2.5859375,
+ "grad_norm": 0.13945375382900238,
+ "learning_rate": 3.984917413452721e-06,
+ "loss": 0.0001,
+ "step": 222
+ },
+ {
+ "epoch": 2.59765625,
+ "grad_norm": 0.06798060238361359,
+ "learning_rate": 3.969463130731183e-06,
+ "loss": 0.0007,
+ "step": 223
+ },
+ {
+ "epoch": 2.609375,
+ "grad_norm": 0.19848179817199707,
+ "learning_rate": 3.953922572287915e-06,
+ "loss": 0.0007,
+ "step": 224
+ },
+ {
+ "epoch": 2.62109375,
+ "grad_norm": 0.5454645156860352,
+ "learning_rate": 3.938296650546552e-06,
+ "loss": 0.0018,
+ "step": 225
+ },
+ {
+ "epoch": 2.6328125,
+ "grad_norm": 0.22043731808662415,
+ "learning_rate": 3.9225862829426184e-06,
+ "loss": 0.0036,
+ "step": 226
+ },
+ {
+ "epoch": 2.64453125,
+ "grad_norm": 0.3086087107658386,
+ "learning_rate": 3.906792391869657e-06,
+ "loss": 0.0002,
+ "step": 227
+ },
+ {
+ "epoch": 2.65625,
+ "grad_norm": 0.04387599974870682,
+ "learning_rate": 3.890915904625075e-06,
+ "loss": 0.0014,
+ "step": 228
+ },
+ {
+ "epoch": 2.66796875,
+ "grad_norm": 0.3786030113697052,
+ "learning_rate": 3.874957753355701e-06,
+ "loss": 0.0014,
+ "step": 229
+ },
+ {
+ "epoch": 2.6796875,
+ "grad_norm": 0.28310713171958923,
+ "learning_rate": 3.858918875003053e-06,
+ "loss": 0.0001,
+ "step": 230
+ },
+ {
+ "epoch": 2.69140625,
+ "grad_norm": 0.0586460717022419,
+ "learning_rate": 3.842800211248333e-06,
+ "loss": 0.0001,
+ "step": 231
+ },
+ {
+ "epoch": 2.703125,
+ "grad_norm": 0.11408677697181702,
+ "learning_rate": 3.8266027084571335e-06,
+ "loss": 0.001,
+ "step": 232
+ },
+ {
+ "epoch": 2.71484375,
+ "grad_norm": 0.06875021010637283,
+ "learning_rate": 3.810327317623881e-06,
+ "loss": 0.0001,
+ "step": 233
+ },
+ {
+ "epoch": 2.7265625,
+ "grad_norm": 0.037388525903224945,
+ "learning_rate": 3.793974994315991e-06,
+ "loss": 0.0002,
+ "step": 234
+ },
+ {
+ "epoch": 2.73828125,
+ "grad_norm": 0.041430581361055374,
+ "learning_rate": 3.7775466986177763e-06,
+ "loss": 0.0015,
+ "step": 235
+ },
+ {
+ "epoch": 2.75,
+ "grad_norm": 0.26019373536109924,
+ "learning_rate": 3.7610433950740667e-06,
+ "loss": 0.0022,
+ "step": 236
+ },
+ {
+ "epoch": 2.76171875,
+ "grad_norm": 0.16638831794261932,
+ "learning_rate": 3.7444660526335853e-06,
+ "loss": 0.0001,
+ "step": 237
+ },
+ {
+ "epoch": 2.7734375,
+ "grad_norm": 0.11822371184825897,
+ "learning_rate": 3.7278156445920584e-06,
+ "loss": 0.0004,
+ "step": 238
+ },
+ {
+ "epoch": 2.78515625,
+ "grad_norm": 0.055076126009225845,
+ "learning_rate": 3.711093148535068e-06,
+ "loss": 0.0001,
+ "step": 239
+ },
+ {
+ "epoch": 2.796875,
+ "grad_norm": 0.08209875971078873,
+ "learning_rate": 3.6942995462806574e-06,
+ "loss": 0.0012,
+ "step": 240
+ },
+ {
+ "epoch": 2.80859375,
+ "grad_norm": 0.10523220896720886,
+ "learning_rate": 3.6774358238216878e-06,
+ "loss": 0.0004,
+ "step": 241
+ },
+ {
+ "epoch": 2.8203125,
+ "grad_norm": 0.09211058169603348,
+ "learning_rate": 3.660502971267945e-06,
+ "loss": 0.0007,
+ "step": 242
+ },
+ {
+ "epoch": 2.83203125,
+ "grad_norm": 0.6209844946861267,
+ "learning_rate": 3.6435019827880093e-06,
+ "loss": 0.0004,
+ "step": 243
+ },
+ {
+ "epoch": 2.84375,
+ "grad_norm": 0.030900023877620697,
+ "learning_rate": 3.626433856550886e-06,
+ "loss": 0.0002,
+ "step": 244
+ },
+ {
+ "epoch": 2.85546875,
+ "grad_norm": 0.041130077093839645,
+ "learning_rate": 3.6092995946673996e-06,
+ "loss": 0.0003,
+ "step": 245
+ },
+ {
+ "epoch": 2.8671875,
+ "grad_norm": 0.052536819130182266,
+ "learning_rate": 3.5921002031313586e-06,
+ "loss": 0.0001,
+ "step": 246
+ },
+ {
+ "epoch": 2.87890625,
+ "grad_norm": 0.027478178963065147,
+ "learning_rate": 3.574836691760489e-06,
+ "loss": 0.0011,
+ "step": 247
+ },
+ {
+ "epoch": 2.890625,
+ "grad_norm": 0.11695867031812668,
+ "learning_rate": 3.557510074137147e-06,
+ "loss": 0.0002,
+ "step": 248
+ },
+ {
+ "epoch": 2.90234375,
+ "grad_norm": 0.08782754838466644,
+ "learning_rate": 3.540121367548811e-06,
+ "loss": 0.001,
+ "step": 249
+ },
+ {
+ "epoch": 2.9140625,
+ "grad_norm": 0.19123269617557526,
+ "learning_rate": 3.5226715929283507e-06,
+ "loss": 0.0001,
+ "step": 250
+ },
+ {
+ "epoch": 2.92578125,
+ "grad_norm": 0.020774945616722107,
+ "learning_rate": 3.505161774794085e-06,
+ "loss": 0.0006,
+ "step": 251
+ },
+ {
+ "epoch": 2.9375,
+ "grad_norm": 0.12062892317771912,
+ "learning_rate": 3.487592941189636e-06,
+ "loss": 0.0001,
+ "step": 252
+ },
+ {
+ "epoch": 2.94921875,
+ "grad_norm": 0.013076180592179298,
+ "learning_rate": 3.469966123623563e-06,
+ "loss": 0.0011,
+ "step": 253
+ },
+ {
+ "epoch": 2.9609375,
+ "grad_norm": 0.22065430879592896,
+ "learning_rate": 3.4522823570088073e-06,
+ "loss": 0.0001,
+ "step": 254
+ },
+ {
+ "epoch": 2.97265625,
+ "grad_norm": 0.027459079399704933,
+ "learning_rate": 3.434542679601922e-06,
+ "loss": 0.0003,
+ "step": 255
+ }
+ ],
+ "logging_steps": 1,
+ "max_steps": 510,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 6,
+ "save_steps": 85,
+ "stateful_callbacks": {
+ "TrainerControl": {
+ "args": {
+ "should_epoch_stop": false,
+ "should_evaluate": false,
+ "should_log": false,
+ "should_save": true,
+ "should_training_stop": false
+ },
+ "attributes": {}
+ }
+ },
+ "total_flos": 6.407713087448678e+17,
+ "train_batch_size": 4,
+ "trial_name": null,
+ "trial_params": null
+}
diff --git a/checkpoint-255/training_args.bin b/checkpoint-255/training_args.bin
new file mode 100644
index 0000000000000000000000000000000000000000..31435c2b54979c306fa2a089f64bc8d21e1d21cf
--- /dev/null
+++ b/checkpoint-255/training_args.bin
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ae0e02a237d0ed5071f0d2c656d0cc6fa0293647ec7cffc6f8d299311f592cdc
+size 8056
diff --git a/checkpoint-255/zero_to_fp32.py b/checkpoint-255/zero_to_fp32.py
new file mode 100644
index 0000000000000000000000000000000000000000..24cc342e78d1a006c782b3a4cd68d9ce786d8fd8
--- /dev/null
+++ b/checkpoint-255/zero_to_fp32.py
@@ -0,0 +1,604 @@
+#!/usr/bin/env python
+
+# Copyright (c) Microsoft Corporation.
+# SPDX-License-Identifier: Apache-2.0
+
+# DeepSpeed Team
+
+# This script extracts fp32 consolidated weights from ZeRO stage 1, 2 and 3 DeepSpeed checkpoints. It gets
+# copied into the top level checkpoint dir, so the user can easily do the conversion at any point in
+# the future. Once extracted, the weights don't require DeepSpeed and can be used in any
+# application.
+#
+# example: python zero_to_fp32.py . pytorch_model.bin
+
+import argparse
+import torch
+import glob
+import math
+import os
+import re
+from collections import OrderedDict
+from dataclasses import dataclass
+
+# while this script doesn't use deepspeed to recover data, since the checkpoints are pickled with
+# DeepSpeed data structures it has to be available in the current python environment.
+from deepspeed.utils import logger
+from deepspeed.checkpoint.constants import (DS_VERSION, OPTIMIZER_STATE_DICT, SINGLE_PARTITION_OF_FP32_GROUPS,
+ FP32_FLAT_GROUPS, ZERO_STAGE, PARTITION_COUNT, PARAM_SHAPES, BUFFER_NAMES,
+ FROZEN_PARAM_SHAPES, FROZEN_PARAM_FRAGMENTS)
+
+
+@dataclass
+class zero_model_state:
+    buffers: dict
+    param_shapes: dict
+    shared_params: list
+    ds_version: int
+    frozen_param_shapes: dict
+    frozen_param_fragments: dict
+
+
+debug = 0
+
+# load to cpu
+device = torch.device('cpu')
+
+
+def atoi(text):
+ return int(text) if text.isdigit() else text
+
+
+def natural_keys(text):
+ '''
+ alist.sort(key=natural_keys) sorts in human order
+ http://nedbatchelder.com/blog/200712/human_sorting.html
+ (See Toothy's implementation in the comments)
+ '''
+ return [atoi(c) for c in re.split(r'(\d+)', text)]
+
+
+def get_model_state_file(checkpoint_dir, zero_stage):
+ if not os.path.isdir(checkpoint_dir):
+ raise FileNotFoundError(f"Directory '{checkpoint_dir}' doesn't exist")
+
+ # there should be only one file
+    if zero_stage <= 2:
+        file = os.path.join(checkpoint_dir, "mp_rank_00_model_states.pt")
+    elif zero_stage == 3:
+        file = os.path.join(checkpoint_dir, "zero_pp_rank_0_mp_rank_00_model_states.pt")
+    else:
+        raise ValueError(f"unknown zero stage {zero_stage}")
+
+ if not os.path.exists(file):
+ raise FileNotFoundError(f"can't find model states file at '{file}'")
+
+ return file
+
+
+def get_checkpoint_files(checkpoint_dir, glob_pattern):
+ # XXX: need to test that this simple glob rule works for multi-node setup too
+ ckpt_files = sorted(glob.glob(os.path.join(checkpoint_dir, glob_pattern)), key=natural_keys)
+
+ if len(ckpt_files) == 0:
+ raise FileNotFoundError(f"can't find {glob_pattern} files in directory '{checkpoint_dir}'")
+
+ return ckpt_files
+
+
+def get_optim_files(checkpoint_dir):
+ return get_checkpoint_files(checkpoint_dir, "*_optim_states.pt")
+
+
+def get_model_state_files(checkpoint_dir):
+ return get_checkpoint_files(checkpoint_dir, "*_model_states.pt")
+
+
+def parse_model_states(files):
+ zero_model_states = []
+ for file in files:
+ state_dict = torch.load(file, map_location=device)
+
+ if BUFFER_NAMES not in state_dict:
+ raise ValueError(f"{file} is not a model state checkpoint")
+ buffer_names = state_dict[BUFFER_NAMES]
+ if debug:
+ print("Found buffers:", buffer_names)
+
+ # recover just the buffers while restoring them to fp32 if they were saved in fp16
+ buffers = {k: v.float() for k, v in state_dict["module"].items() if k in buffer_names}
+ param_shapes = state_dict[PARAM_SHAPES]
+
+ # collect parameters that are included in param_shapes
+ param_names = []
+ for s in param_shapes:
+ for name in s.keys():
+ param_names.append(name)
+
+ # update with frozen parameters
+ frozen_param_shapes = state_dict.get(FROZEN_PARAM_SHAPES, None)
+ if frozen_param_shapes is not None:
+ if debug:
+ print(f"Found frozen_param_shapes: {frozen_param_shapes}")
+ param_names += list(frozen_param_shapes.keys())
+
+ # handle shared params
+ shared_params = [[k, v] for k, v in state_dict["shared_params"].items()]
+
+ ds_version = state_dict.get(DS_VERSION, None)
+
+ frozen_param_fragments = state_dict.get(FROZEN_PARAM_FRAGMENTS, None)
+
+ z_model_state = zero_model_state(buffers=buffers,
+ param_shapes=param_shapes,
+ shared_params=shared_params,
+ ds_version=ds_version,
+ frozen_param_shapes=frozen_param_shapes,
+ frozen_param_fragments=frozen_param_fragments)
+ zero_model_states.append(z_model_state)
+
+ return zero_model_states
+
+
+def parse_optim_states(files, ds_checkpoint_dir):
+
+ total_files = len(files)
+ state_dicts = []
+ for f in files:
+ state_dict = torch.load(f, map_location=device)
+        # immediately discard the potentially huge optimizer states as we only care for fp32 master weights
+ # and also handle the case where it was already removed by another helper script
+ state_dict["optimizer_state_dict"].pop("optimizer_state_dict", None)
+ state_dicts.append(state_dict)
+
+    if ZERO_STAGE not in state_dicts[0][OPTIMIZER_STATE_DICT]:
+ raise ValueError(f"{files[0]} is not a zero checkpoint")
+ zero_stage = state_dicts[0][OPTIMIZER_STATE_DICT][ZERO_STAGE]
+ world_size = state_dicts[0][OPTIMIZER_STATE_DICT][PARTITION_COUNT]
+
+ # For ZeRO-2 each param group can have different partition_count as data parallelism for expert
+ # parameters can be different from data parallelism for non-expert parameters. So we can just
+ # use the max of the partition_count to get the dp world_size.
+
+ if type(world_size) is list:
+ world_size = max(world_size)
+
+ if world_size != total_files:
+ raise ValueError(
+ f"Expected {world_size} of '*_optim_states.pt' under '{ds_checkpoint_dir}' but found {total_files} files. "
+ "Possibly due to an overwrite of an old checkpoint, or a checkpoint didn't get saved by one or more processes."
+ )
+
+ # the groups are named differently in each stage
+ if zero_stage <= 2:
+ fp32_groups_key = SINGLE_PARTITION_OF_FP32_GROUPS
+ elif zero_stage == 3:
+ fp32_groups_key = FP32_FLAT_GROUPS
+ else:
+ raise ValueError(f"unknown zero stage {zero_stage}")
+
+ if zero_stage <= 2:
+ fp32_flat_groups = [state_dicts[i][OPTIMIZER_STATE_DICT][fp32_groups_key] for i in range(len(state_dicts))]
+ elif zero_stage == 3:
+ # if there is more than one param group, there will be multiple flattened tensors - one
+ # flattened tensor per group - for simplicity merge them into a single tensor
+ #
+ # XXX: could make the script more memory efficient for when there are multiple groups - it
+ # will require matching the sub-lists of param_shapes for each param group flattened tensor
+
+ fp32_flat_groups = [
+ torch.cat(state_dicts[i][OPTIMIZER_STATE_DICT][fp32_groups_key], 0) for i in range(len(state_dicts))
+ ]
+
+ return zero_stage, world_size, fp32_flat_groups
+
+
+def _get_fp32_state_dict_from_zero_checkpoint(ds_checkpoint_dir, exclude_frozen_parameters):
+ """
+ Returns fp32 state_dict reconstructed from ds checkpoint
+
+    Args:
+        - ``ds_checkpoint_dir``: path to the deepspeed checkpoint folder (where the optimizer files are)
+        - ``exclude_frozen_parameters``: exclude frozen parameters
+
+    """
+ print(f"Processing zero checkpoint '{ds_checkpoint_dir}'")
+
+ optim_files = get_optim_files(ds_checkpoint_dir)
+ zero_stage, world_size, fp32_flat_groups = parse_optim_states(optim_files, ds_checkpoint_dir)
+ print(f"Detected checkpoint of type zero stage {zero_stage}, world_size: {world_size}")
+
+ model_files = get_model_state_files(ds_checkpoint_dir)
+
+ zero_model_states = parse_model_states(model_files)
+ print(f'Parsing checkpoint created by deepspeed=={zero_model_states[0].ds_version}')
+
+ if zero_stage <= 2:
+ return _get_fp32_state_dict_from_zero2_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters)
+ elif zero_stage == 3:
+ return _get_fp32_state_dict_from_zero3_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters)
+
+
+def _zero2_merge_frozen_params(state_dict, zero_model_states):
+ if zero_model_states[0].frozen_param_shapes is None or len(zero_model_states[0].frozen_param_shapes) == 0:
+ return
+
+ frozen_param_shapes = zero_model_states[0].frozen_param_shapes
+ frozen_param_fragments = zero_model_states[0].frozen_param_fragments
+
+ if debug:
+ num_elem = sum(s.numel() for s in frozen_param_shapes.values())
+ print(f'rank 0: {FROZEN_PARAM_SHAPES}.numel = {num_elem}')
+
+ wanted_params = len(frozen_param_shapes)
+ wanted_numel = sum(s.numel() for s in frozen_param_shapes.values())
+ avail_numel = sum([p.numel() for p in frozen_param_fragments.values()])
+ print(f'Frozen params: Have {avail_numel} numels to process.')
+ print(f'Frozen params: Need {wanted_numel} numels in {wanted_params} params')
+
+ total_params = 0
+ total_numel = 0
+ for name, shape in frozen_param_shapes.items():
+ total_params += 1
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+
+ state_dict[name] = frozen_param_fragments[name]
+
+ if debug:
+ print(f"{name} full shape: {shape} unpartitioned numel {unpartitioned_numel} ")
+
+ print(f"Reconstructed Frozen fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _has_callable(obj, fn):
+ attr = getattr(obj, fn, None)
+ return callable(attr)
+
+
+def _zero2_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states):
+ param_shapes = zero_model_states[0].param_shapes
+
+ # Reconstruction protocol:
+ #
+ # XXX: document this
+
+ if debug:
+ for i in range(world_size):
+ for j in range(len(fp32_flat_groups[0])):
+ print(f"{FP32_FLAT_GROUPS}[{i}][{j}].shape={fp32_flat_groups[i][j].shape}")
+
+ # XXX: memory usage doubles here (zero2)
+ num_param_groups = len(fp32_flat_groups[0])
+ merged_single_partition_of_fp32_groups = []
+ for i in range(num_param_groups):
+ merged_partitions = [sd[i] for sd in fp32_flat_groups]
+ full_single_fp32_vector = torch.cat(merged_partitions, 0)
+ merged_single_partition_of_fp32_groups.append(full_single_fp32_vector)
+ avail_numel = sum(
+ [full_single_fp32_vector.numel() for full_single_fp32_vector in merged_single_partition_of_fp32_groups])
+
+ if debug:
+ wanted_params = sum([len(shapes) for shapes in param_shapes])
+ wanted_numel = sum([sum(shape.numel() for shape in shapes.values()) for shapes in param_shapes])
+ # not asserting if there is a mismatch due to possible padding
+ print(f"Have {avail_numel} numels to process.")
+ print(f"Need {wanted_numel} numels in {wanted_params} params.")
+
+ # params
+ # XXX: for huge models that can't fit into the host's RAM we will have to recode this to support
+ # out-of-core computing solution
+ total_numel = 0
+ total_params = 0
+ for shapes, full_single_fp32_vector in zip(param_shapes, merged_single_partition_of_fp32_groups):
+ offset = 0
+ avail_numel = full_single_fp32_vector.numel()
+ for name, shape in shapes.items():
+
+ unpartitioned_numel = shape.numel() if _has_callable(shape, 'numel') else math.prod(shape)
+ total_numel += unpartitioned_numel
+ total_params += 1
+
+ if debug:
+ print(f"{name} full shape: {shape} unpartitioned numel {unpartitioned_numel} ")
+ state_dict[name] = full_single_fp32_vector.narrow(0, offset, unpartitioned_numel).view(shape)
+ offset += unpartitioned_numel
+
+ # Z2 started to align to 2*world_size to improve nccl performance. Therefore both offset and
+ # avail_numel can differ by anywhere between 0..2*world_size. Due to two unrelated complex
+ # paddings performed in the code it's almost impossible to predict the exact numbers w/o the
+ # live optimizer object, so we are checking that the numbers are within the right range
+ align_to = 2 * world_size
+
+ def zero2_align(x):
+ return align_to * math.ceil(x / align_to)
+
+ if debug:
+ print(f"original offset={offset}, avail_numel={avail_numel}")
+
+ offset = zero2_align(offset)
+ avail_numel = zero2_align(avail_numel)
+
+ if debug:
+ print(f"aligned offset={offset}, avail_numel={avail_numel}")
+
+ # Sanity check
+ if offset != avail_numel:
+ raise ValueError(f"consumed {offset} numels out of {avail_numel} - something is wrong")
+
+ print(f"Reconstructed fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _get_fp32_state_dict_from_zero2_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters):
+ state_dict = OrderedDict()
+
+ # buffers
+ buffers = zero_model_states[0].buffers
+ state_dict.update(buffers)
+ if debug:
+ print(f"added {len(buffers)} buffers")
+
+ if not exclude_frozen_parameters:
+ _zero2_merge_frozen_params(state_dict, zero_model_states)
+
+ _zero2_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states)
+
+ # recover shared parameters
+ for pair in zero_model_states[0].shared_params:
+ if pair[1] in state_dict:
+ state_dict[pair[0]] = state_dict[pair[1]]
+
+ return state_dict
+
+
+def zero3_partitioned_param_info(unpartitioned_numel, world_size):
+ remainder = unpartitioned_numel % world_size
+ padding_numel = (world_size - remainder) if remainder else 0
+ partitioned_numel = math.ceil(unpartitioned_numel / world_size)
+ return partitioned_numel, padding_numel
+
+
+def _zero3_merge_frozen_params(state_dict, world_size, zero_model_states):
+ if zero_model_states[0].frozen_param_shapes is None or len(zero_model_states[0].frozen_param_shapes) == 0:
+ return
+
+ if debug:
+ for i in range(world_size):
+ num_elem = sum(s.numel() for s in zero_model_states[i].frozen_param_fragments.values())
+ print(f'rank {i}: {FROZEN_PARAM_SHAPES}.numel = {num_elem}')
+
+ frozen_param_shapes = zero_model_states[0].frozen_param_shapes
+ wanted_params = len(frozen_param_shapes)
+ wanted_numel = sum(s.numel() for s in frozen_param_shapes.values())
+ avail_numel = sum([p.numel() for p in zero_model_states[0].frozen_param_fragments.values()]) * world_size
+ print(f'Frozen params: Have {avail_numel} numels to process.')
+ print(f'Frozen params: Need {wanted_numel} numels in {wanted_params} params')
+
+ total_params = 0
+ total_numel = 0
+ for name, shape in zero_model_states[0].frozen_param_shapes.items():
+ total_params += 1
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+
+ param_frags = tuple(model_state.frozen_param_fragments[name] for model_state in zero_model_states)
+ state_dict[name] = torch.cat(param_frags, 0).narrow(0, 0, unpartitioned_numel).view(shape)
+
+ partitioned_numel, partitioned_padding_numel = zero3_partitioned_param_info(unpartitioned_numel, world_size)
+
+ if debug:
+ print(
+ f"Frozen params: {total_params} {name} full shape: {shape} partition0 numel={partitioned_numel} partitioned_padding_numel={partitioned_padding_numel}"
+ )
+
+ print(f"Reconstructed Frozen fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _zero3_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states):
+ param_shapes = zero_model_states[0].param_shapes
+ avail_numel = fp32_flat_groups[0].numel() * world_size
+ # Reconstruction protocol: For zero3 we need to zip the partitions together at boundary of each
+ # param, re-consolidating each param, while dealing with padding if any
+
+ # merge list of dicts, preserving order
+ param_shapes = {k: v for d in param_shapes for k, v in d.items()}
+
+ if debug:
+ for i in range(world_size):
+ print(f"{FP32_FLAT_GROUPS}[{i}].shape={fp32_flat_groups[i].shape}")
+
+ wanted_params = len(param_shapes)
+ wanted_numel = sum(shape.numel() for shape in param_shapes.values())
+ # not asserting if there is a mismatch due to possible padding
+ avail_numel = fp32_flat_groups[0].numel() * world_size
+ print(f"Trainable params: Have {avail_numel} numels to process.")
+ print(f"Trainable params: Need {wanted_numel} numels in {wanted_params} params.")
+
+ # params
+ # XXX: for huge models that can't fit into the host's RAM we will have to recode this to support
+ # out-of-core computing solution
+ offset = 0
+ total_numel = 0
+ total_params = 0
+ for name, shape in param_shapes.items():
+
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+ total_params += 1
+
+ partitioned_numel, partitioned_padding_numel = zero3_partitioned_param_info(unpartitioned_numel, world_size)
+
+ if debug:
+ print(
+ f"Trainable params: {total_params} {name} full shape: {shape} partition0 numel={partitioned_numel} partitioned_padding_numel={partitioned_padding_numel}"
+ )
+
+ # XXX: memory usage doubles here
+ state_dict[name] = torch.cat(
+ tuple(fp32_flat_groups[i].narrow(0, offset, partitioned_numel) for i in range(world_size)),
+ 0).narrow(0, 0, unpartitioned_numel).view(shape)
+ offset += partitioned_numel
+
+ offset *= world_size
+
+ # Sanity check
+ if offset != avail_numel:
+ raise ValueError(f"consumed {offset} numels out of {avail_numel} - something is wrong")
+
+ print(f"Reconstructed Trainable fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _get_fp32_state_dict_from_zero3_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters):
+ state_dict = OrderedDict()
+
+ # buffers
+ buffers = zero_model_states[0].buffers
+ state_dict.update(buffers)
+ if debug:
+ print(f"added {len(buffers)} buffers")
+
+ if not exclude_frozen_parameters:
+ _zero3_merge_frozen_params(state_dict, world_size, zero_model_states)
+
+ _zero3_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states)
+
+ # recover shared parameters
+ for pair in zero_model_states[0].shared_params:
+ if pair[1] in state_dict:
+ state_dict[pair[0]] = state_dict[pair[1]]
+
+ return state_dict
+
+
+def get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag=None, exclude_frozen_parameters=False):
+ """
+ Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated state_dict that can be loaded with
+ ``load_state_dict()`` and used for training without DeepSpeed or shared with others, for example
+ via a model hub.
+
+ Args:
+ - ``checkpoint_dir``: path to the desired checkpoint folder
+        - ``tag``: checkpoint tag used as a unique identifier for checkpoint. If not provided will attempt to load the tag from the file named ``latest`` in the checkpoint folder, e.g., ``global_step14``
+ - ``exclude_frozen_parameters``: exclude frozen parameters
+
+ Returns:
+ - pytorch ``state_dict``
+
+ Note: this approach may not work if your application doesn't have sufficient free CPU memory and
+ you may need to use the offline approach using the ``zero_to_fp32.py`` script that is saved with
+ the checkpoint.
+
+ A typical usage might be ::
+
+ from deepspeed.utils.zero_to_fp32 import get_fp32_state_dict_from_zero_checkpoint
+ # do the training and checkpoint saving
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir) # already on cpu
+ model = model.cpu() # move to cpu
+ model.load_state_dict(state_dict)
+ # submit to model hub or save the model to share with others
+
+    In this example the ``model`` will no longer be usable in the deepspeed context of the same
+    application, i.e. you will need to re-initialize the deepspeed engine, since
+ ``model.load_state_dict(state_dict)`` will remove all the deepspeed magic from it.
+
+ If you want it all done for you, use ``load_state_dict_from_zero_checkpoint`` instead.
+
+ """
+ if tag is None:
+ latest_path = os.path.join(checkpoint_dir, 'latest')
+ if os.path.isfile(latest_path):
+ with open(latest_path, 'r') as fd:
+ tag = fd.read().strip()
+ else:
+ raise ValueError(f"Unable to find 'latest' file at {latest_path}")
+
+ ds_checkpoint_dir = os.path.join(checkpoint_dir, tag)
+
+ if not os.path.isdir(ds_checkpoint_dir):
+ raise FileNotFoundError(f"Directory '{ds_checkpoint_dir}' doesn't exist")
+
+ return _get_fp32_state_dict_from_zero_checkpoint(ds_checkpoint_dir, exclude_frozen_parameters)
+
+
+def convert_zero_checkpoint_to_fp32_state_dict(checkpoint_dir, output_file, tag=None, exclude_frozen_parameters=False):
+ """
+ Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated ``state_dict`` file that can be
+ loaded with ``torch.load(file)`` + ``load_state_dict()`` and used for training without DeepSpeed.
+
+ Args:
+ - ``checkpoint_dir``: path to the desired checkpoint folder. (one that contains the tag-folder, like ``global_step14``)
+ - ``output_file``: path to the pytorch fp32 state_dict output file (e.g. path/pytorch_model.bin)
+ - ``tag``: checkpoint tag used as a unique identifier for checkpoint. If not provided will attempt to load tag in the file named ``latest`` in the checkpoint folder, e.g., ``global_step14``
+ - ``exclude_frozen_parameters``: exclude frozen parameters
+ """
+
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag, exclude_frozen_parameters)
+ print(f"Saving fp32 state dict to {output_file}")
+ torch.save(state_dict, output_file)
+
+
+def load_state_dict_from_zero_checkpoint(model, checkpoint_dir, tag=None):
+ """
+ 1. Put the provided model to cpu
+ 2. Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated ``state_dict``
+ 3. Load it into the provided model
+
+ Args:
+ - ``model``: the model object to update
+ - ``checkpoint_dir``: path to the desired checkpoint folder. (one that contains the tag-folder, like ``global_step14``)
+ - ``tag``: checkpoint tag used as a unique identifier for checkpoint. If not provided will attempt to load tag in the file named ``latest`` in the checkpoint folder, e.g., ``global_step14``
+
+ Returns:
+        - ``model``: modified model
+
+ Make sure you have plenty of CPU memory available before you call this function. If you don't
+ have enough use the ``zero_to_fp32.py`` utility to do the conversion. You will find it
+ conveniently placed for you in the checkpoint folder.
+
+ A typical usage might be ::
+
+ from deepspeed.utils.zero_to_fp32 import load_state_dict_from_zero_checkpoint
+ model = load_state_dict_from_zero_checkpoint(trainer.model, checkpoint_dir)
+ # submit to model hub or save the model to share with others
+
+    Note that once this has been run, the ``model`` will no longer be usable in the deepspeed context
+    of the same application, i.e. you will need to re-initialize the deepspeed engine, since
+ ``model.load_state_dict(state_dict)`` will remove all the deepspeed magic from it.
+
+ """
+    logger.info("Extracting fp32 weights")
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag)
+
+    logger.info("Overwriting model with fp32 weights")
+ model = model.cpu()
+ model.load_state_dict(state_dict, strict=False)
+
+ return model
+
+
+if __name__ == "__main__":
+
+ parser = argparse.ArgumentParser()
+ parser.add_argument("checkpoint_dir",
+ type=str,
+ help="path to the desired checkpoint folder, e.g., path/checkpoint-12")
+ parser.add_argument(
+ "output_file",
+ type=str,
+ help="path to the pytorch fp32 state_dict output file (e.g. path/checkpoint-12/pytorch_model.bin)")
+ parser.add_argument("-t",
+ "--tag",
+ type=str,
+ default=None,
+ help="checkpoint tag used as a unique identifier for checkpoint. e.g., global_step1")
+ parser.add_argument("--exclude_frozen_parameters", action='store_true', help="exclude frozen parameters")
+ parser.add_argument("-d", "--debug", action='store_true', help="enable debug")
+ args = parser.parse_args()
+
+ debug = args.debug
+
+ convert_zero_checkpoint_to_fp32_state_dict(args.checkpoint_dir,
+ args.output_file,
+ tag=args.tag,
+ exclude_frozen_parameters=args.exclude_frozen_parameters)
diff --git a/checkpoint-340/README.md b/checkpoint-340/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..be5c87703f12b547886cc6a2ecfbe9ee150496fa
--- /dev/null
+++ b/checkpoint-340/README.md
@@ -0,0 +1,202 @@
+---
+base_model: meta-llama/Llama-3.1-8B-Instruct
+library_name: peft
+---
+
+# Model Card for Model ID
+
+
+
+
+
+## Model Details
+
+### Model Description
+
+
+
+
+
+- **Developed by:** [More Information Needed]
+- **Funded by [optional]:** [More Information Needed]
+- **Shared by [optional]:** [More Information Needed]
+- **Model type:** [More Information Needed]
+- **Language(s) (NLP):** [More Information Needed]
+- **License:** [More Information Needed]
+- **Finetuned from model [optional]:** [More Information Needed]
+
+### Model Sources [optional]
+
+
+
+- **Repository:** [More Information Needed]
+- **Paper [optional]:** [More Information Needed]
+- **Demo [optional]:** [More Information Needed]
+
+## Uses
+
+
+
+### Direct Use
+
+
+
+[More Information Needed]
+
+### Downstream Use [optional]
+
+
+
+[More Information Needed]
+
+### Out-of-Scope Use
+
+
+
+[More Information Needed]
+
+## Bias, Risks, and Limitations
+
+
+
+[More Information Needed]
+
+### Recommendations
+
+
+
+Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
+
+## How to Get Started with the Model
+
+Use the code below to get started with the model.
+
+[More Information Needed]
+
+## Training Details
+
+### Training Data
+
+
+
+[More Information Needed]
+
+### Training Procedure
+
+
+
+#### Preprocessing [optional]
+
+[More Information Needed]
+
+
+#### Training Hyperparameters
+
+- **Training regime:** [More Information Needed]
+
+#### Speeds, Sizes, Times [optional]
+
+
+
+[More Information Needed]
+
+## Evaluation
+
+
+
+### Testing Data, Factors & Metrics
+
+#### Testing Data
+
+
+
+[More Information Needed]
+
+#### Factors
+
+
+
+[More Information Needed]
+
+#### Metrics
+
+
+
+[More Information Needed]
+
+### Results
+
+[More Information Needed]
+
+#### Summary
+
+
+
+## Model Examination [optional]
+
+
+
+[More Information Needed]
+
+## Environmental Impact
+
+
+
+Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
+
+- **Hardware Type:** [More Information Needed]
+- **Hours used:** [More Information Needed]
+- **Cloud Provider:** [More Information Needed]
+- **Compute Region:** [More Information Needed]
+- **Carbon Emitted:** [More Information Needed]
+
+## Technical Specifications [optional]
+
+### Model Architecture and Objective
+
+[More Information Needed]
+
+### Compute Infrastructure
+
+[More Information Needed]
+
+#### Hardware
+
+[More Information Needed]
+
+#### Software
+
+[More Information Needed]
+
+## Citation [optional]
+
+
+
+**BibTeX:**
+
+[More Information Needed]
+
+**APA:**
+
+[More Information Needed]
+
+## Glossary [optional]
+
+
+
+[More Information Needed]
+
+## More Information [optional]
+
+[More Information Needed]
+
+## Model Card Authors [optional]
+
+[More Information Needed]
+
+## Model Card Contact
+
+[More Information Needed]
+
+### Framework versions
+
+- PEFT 0.14.0
\ No newline at end of file
diff --git a/checkpoint-340/adapter_config.json b/checkpoint-340/adapter_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..9dfb3ab60881d002c4cdbcc157a93958018fe683
--- /dev/null
+++ b/checkpoint-340/adapter_config.json
@@ -0,0 +1,40 @@
+{
+ "alpha_pattern": {},
+ "auto_mapping": null,
+ "base_model_name_or_path": "meta-llama/Llama-3.1-8B-Instruct",
+ "bias": "none",
+ "eva_config": null,
+ "exclude_modules": null,
+ "fan_in_fan_out": null,
+ "inference_mode": true,
+ "init_lora_weights": true,
+ "layer_replication": null,
+ "layers_pattern": null,
+ "layers_to_transform": null,
+ "loftq_config": {},
+ "lora_alpha": 512,
+ "lora_bias": false,
+ "lora_dropout": 0.05,
+ "megatron_config": null,
+ "megatron_core": "megatron.core",
+ "modules_to_save": [
+ "embed_tokens",
+ "lm_head"
+ ],
+ "peft_type": "LORA",
+ "r": 256,
+ "rank_pattern": {},
+ "revision": null,
+ "target_modules": [
+ "v_proj",
+ "up_proj",
+ "q_proj",
+ "o_proj",
+ "down_proj",
+ "gate_proj",
+ "k_proj"
+ ],
+ "task_type": "CAUSAL_LM",
+ "use_dora": false,
+ "use_rslora": false
+}
\ No newline at end of file
diff --git a/checkpoint-340/adapter_model.safetensors b/checkpoint-340/adapter_model.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..b032a5c450c1725689f5cc227f69b5cb101462e0
--- /dev/null
+++ b/checkpoint-340/adapter_model.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5fb3b4891ca6a5aebd5b2be3f37cb7fb9bc6d37f0f01ee305fff0657b2581e4c
+size 3443586272
diff --git a/checkpoint-340/global_step338/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt b/checkpoint-340/global_step338/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt
new file mode 100644
index 0000000000000000000000000000000000000000..ecc7a8c483726db646c684772d7a6117033d5706
--- /dev/null
+++ b/checkpoint-340/global_step338/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9e75d955f5522730574eca7938571da55a934633e2b7a8eb72eba99b7efae924
+size 20661195036
diff --git a/checkpoint-340/global_step338/mp_rank_00_model_states.pt b/checkpoint-340/global_step338/mp_rank_00_model_states.pt
new file mode 100644
index 0000000000000000000000000000000000000000..675450125e621413dd66b2e1bf6cb302e76d6d19
--- /dev/null
+++ b/checkpoint-340/global_step338/mp_rank_00_model_states.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:38b4fd881cc6009b626ce387641533f1d5ca9cab044074cce24c7ae97d02b459
+size 3555326649
diff --git a/checkpoint-340/latest b/checkpoint-340/latest
new file mode 100644
index 0000000000000000000000000000000000000000..8f9396e1c5ef87f9cde1f9794475073bc18d36e7
--- /dev/null
+++ b/checkpoint-340/latest
@@ -0,0 +1 @@
+global_step338
\ No newline at end of file
diff --git a/checkpoint-340/rng_state.pth b/checkpoint-340/rng_state.pth
new file mode 100644
index 0000000000000000000000000000000000000000..6fe29ac6ea727277162a9e295e727e1cf44ab749
--- /dev/null
+++ b/checkpoint-340/rng_state.pth
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a33b616ca4cbe1fe45d19171db9f73561015cf29364aae14d288bd040d5fbdbb
+size 14244
diff --git a/checkpoint-340/scheduler.pt b/checkpoint-340/scheduler.pt
new file mode 100644
index 0000000000000000000000000000000000000000..2d088b87d412f35e52da7c2215836ab826cc1fe0
--- /dev/null
+++ b/checkpoint-340/scheduler.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a9d14e63423790833ee1bd6753de5623b963eb3e049d15053cdb103b84b48ef3
+size 1064
diff --git a/checkpoint-340/special_tokens_map.json b/checkpoint-340/special_tokens_map.json
new file mode 100644
index 0000000000000000000000000000000000000000..278b7f0f84be865c4687700ee7b3c63d89a51e18
--- /dev/null
+++ b/checkpoint-340/special_tokens_map.json
@@ -0,0 +1,23 @@
+{
+ "bos_token": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|eot_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<|end_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+}
diff --git a/checkpoint-340/tokenizer.json b/checkpoint-340/tokenizer.json
new file mode 100644
index 0000000000000000000000000000000000000000..1c1d8d5c9024994f1d3b00f9662b8dd89ca13cf2
--- /dev/null
+++ b/checkpoint-340/tokenizer.json
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6b9e4e7fb171f92fd137b777cc2714bf87d11576700a1dcd7a399e7bbe39537b
+size 17209920
diff --git a/checkpoint-340/tokenizer_config.json b/checkpoint-340/tokenizer_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..ca91a2ef55f4239a7af81d7c9abb05f53621a07b
--- /dev/null
+++ b/checkpoint-340/tokenizer_config.json
@@ -0,0 +1,2064 @@
+{
+ "added_tokens_decoder": {
+ "128000": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128001": {
+ "content": "<|end_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128002": {
+ "content": "<|reserved_special_token_0|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128003": {
+ "content": "<|reserved_special_token_1|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128004": {
+ "content": "<|finetune_right_pad_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128005": {
+ "content": "<|reserved_special_token_2|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128006": {
+ "content": "<|start_header_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128007": {
+ "content": "<|end_header_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128008": {
+ "content": "<|eom_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128009": {
+ "content": "<|eot_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128010": {
+ "content": "<|python_tag|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128011": {
+ "content": "<|reserved_special_token_3|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128012": {
+ "content": "<|reserved_special_token_4|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128013": {
+ "content": "<|reserved_special_token_5|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128014": {
+ "content": "<|reserved_special_token_6|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128015": {
+ "content": "<|reserved_special_token_7|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128016": {
+ "content": "<|reserved_special_token_8|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128017": {
+ "content": "<|reserved_special_token_9|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128018": {
+ "content": "<|reserved_special_token_10|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128019": {
+ "content": "<|reserved_special_token_11|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128020": {
+ "content": "<|reserved_special_token_12|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128021": {
+ "content": "<|reserved_special_token_13|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128022": {
+ "content": "<|reserved_special_token_14|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128023": {
+ "content": "<|reserved_special_token_15|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128024": {
+ "content": "<|reserved_special_token_16|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128025": {
+ "content": "<|reserved_special_token_17|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128026": {
+ "content": "<|reserved_special_token_18|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128027": {
+ "content": "<|reserved_special_token_19|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128028": {
+ "content": "<|reserved_special_token_20|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128029": {
+ "content": "<|reserved_special_token_21|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128030": {
+ "content": "<|reserved_special_token_22|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128031": {
+ "content": "<|reserved_special_token_23|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128032": {
+ "content": "<|reserved_special_token_24|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128033": {
+ "content": "<|reserved_special_token_25|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128034": {
+ "content": "<|reserved_special_token_26|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128035": {
+ "content": "<|reserved_special_token_27|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128036": {
+ "content": "<|reserved_special_token_28|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128037": {
+ "content": "<|reserved_special_token_29|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128038": {
+ "content": "<|reserved_special_token_30|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128039": {
+ "content": "<|reserved_special_token_31|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128040": {
+ "content": "<|reserved_special_token_32|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128041": {
+ "content": "<|reserved_special_token_33|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128042": {
+ "content": "<|reserved_special_token_34|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128043": {
+ "content": "<|reserved_special_token_35|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128044": {
+ "content": "<|reserved_special_token_36|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128045": {
+ "content": "<|reserved_special_token_37|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128046": {
+ "content": "<|reserved_special_token_38|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128047": {
+ "content": "<|reserved_special_token_39|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128048": {
+ "content": "<|reserved_special_token_40|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128049": {
+ "content": "<|reserved_special_token_41|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128050": {
+ "content": "<|reserved_special_token_42|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128051": {
+ "content": "<|reserved_special_token_43|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128052": {
+ "content": "<|reserved_special_token_44|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128053": {
+ "content": "<|reserved_special_token_45|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128054": {
+ "content": "<|reserved_special_token_46|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128055": {
+ "content": "<|reserved_special_token_47|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128056": {
+ "content": "<|reserved_special_token_48|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128057": {
+ "content": "<|reserved_special_token_49|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128058": {
+ "content": "<|reserved_special_token_50|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128059": {
+ "content": "<|reserved_special_token_51|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128060": {
+ "content": "<|reserved_special_token_52|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128061": {
+ "content": "<|reserved_special_token_53|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128062": {
+ "content": "<|reserved_special_token_54|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128063": {
+ "content": "<|reserved_special_token_55|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128064": {
+ "content": "<|reserved_special_token_56|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128065": {
+ "content": "<|reserved_special_token_57|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128066": {
+ "content": "<|reserved_special_token_58|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128067": {
+ "content": "<|reserved_special_token_59|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128068": {
+ "content": "<|reserved_special_token_60|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128069": {
+ "content": "<|reserved_special_token_61|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128070": {
+ "content": "<|reserved_special_token_62|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128071": {
+ "content": "<|reserved_special_token_63|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128072": {
+ "content": "<|reserved_special_token_64|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128073": {
+ "content": "<|reserved_special_token_65|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128074": {
+ "content": "<|reserved_special_token_66|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128075": {
+ "content": "<|reserved_special_token_67|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128076": {
+ "content": "<|reserved_special_token_68|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128077": {
+ "content": "<|reserved_special_token_69|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128078": {
+ "content": "<|reserved_special_token_70|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128079": {
+ "content": "<|reserved_special_token_71|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128080": {
+ "content": "<|reserved_special_token_72|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128081": {
+ "content": "<|reserved_special_token_73|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128082": {
+ "content": "<|reserved_special_token_74|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128083": {
+ "content": "<|reserved_special_token_75|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128084": {
+ "content": "<|reserved_special_token_76|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128085": {
+ "content": "<|reserved_special_token_77|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128086": {
+ "content": "<|reserved_special_token_78|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128087": {
+ "content": "<|reserved_special_token_79|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128088": {
+ "content": "<|reserved_special_token_80|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128089": {
+ "content": "<|reserved_special_token_81|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128090": {
+ "content": "<|reserved_special_token_82|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128091": {
+ "content": "<|reserved_special_token_83|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128092": {
+ "content": "<|reserved_special_token_84|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128093": {
+ "content": "<|reserved_special_token_85|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128094": {
+ "content": "<|reserved_special_token_86|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128095": {
+ "content": "<|reserved_special_token_87|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128096": {
+ "content": "<|reserved_special_token_88|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128097": {
+ "content": "<|reserved_special_token_89|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128098": {
+ "content": "<|reserved_special_token_90|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128099": {
+ "content": "<|reserved_special_token_91|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128100": {
+ "content": "<|reserved_special_token_92|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128101": {
+ "content": "<|reserved_special_token_93|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128102": {
+ "content": "<|reserved_special_token_94|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128103": {
+ "content": "<|reserved_special_token_95|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128104": {
+ "content": "<|reserved_special_token_96|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128105": {
+ "content": "<|reserved_special_token_97|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128106": {
+ "content": "<|reserved_special_token_98|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128107": {
+ "content": "<|reserved_special_token_99|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128108": {
+ "content": "<|reserved_special_token_100|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128109": {
+ "content": "<|reserved_special_token_101|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128110": {
+ "content": "<|reserved_special_token_102|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128111": {
+ "content": "<|reserved_special_token_103|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128112": {
+ "content": "<|reserved_special_token_104|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128113": {
+ "content": "<|reserved_special_token_105|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128114": {
+ "content": "<|reserved_special_token_106|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128115": {
+ "content": "<|reserved_special_token_107|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128116": {
+ "content": "<|reserved_special_token_108|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128117": {
+ "content": "<|reserved_special_token_109|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128118": {
+ "content": "<|reserved_special_token_110|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128119": {
+ "content": "<|reserved_special_token_111|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128120": {
+ "content": "<|reserved_special_token_112|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128121": {
+ "content": "<|reserved_special_token_113|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128122": {
+ "content": "<|reserved_special_token_114|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128123": {
+ "content": "<|reserved_special_token_115|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128124": {
+ "content": "<|reserved_special_token_116|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128125": {
+ "content": "<|reserved_special_token_117|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128126": {
+ "content": "<|reserved_special_token_118|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128127": {
+ "content": "<|reserved_special_token_119|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128128": {
+ "content": "<|reserved_special_token_120|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128129": {
+ "content": "<|reserved_special_token_121|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128130": {
+ "content": "<|reserved_special_token_122|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128131": {
+ "content": "<|reserved_special_token_123|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128132": {
+ "content": "<|reserved_special_token_124|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128133": {
+ "content": "<|reserved_special_token_125|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128134": {
+ "content": "<|reserved_special_token_126|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128135": {
+ "content": "<|reserved_special_token_127|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128136": {
+ "content": "<|reserved_special_token_128|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128137": {
+ "content": "<|reserved_special_token_129|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128138": {
+ "content": "<|reserved_special_token_130|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128139": {
+ "content": "<|reserved_special_token_131|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128140": {
+ "content": "<|reserved_special_token_132|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128141": {
+ "content": "<|reserved_special_token_133|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128142": {
+ "content": "<|reserved_special_token_134|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128143": {
+ "content": "<|reserved_special_token_135|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128144": {
+ "content": "<|reserved_special_token_136|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128145": {
+ "content": "<|reserved_special_token_137|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128146": {
+ "content": "<|reserved_special_token_138|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128147": {
+ "content": "<|reserved_special_token_139|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128148": {
+ "content": "<|reserved_special_token_140|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128149": {
+ "content": "<|reserved_special_token_141|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128150": {
+ "content": "<|reserved_special_token_142|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128151": {
+ "content": "<|reserved_special_token_143|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128152": {
+ "content": "<|reserved_special_token_144|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128153": {
+ "content": "<|reserved_special_token_145|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128154": {
+ "content": "<|reserved_special_token_146|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128155": {
+ "content": "<|reserved_special_token_147|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128156": {
+ "content": "<|reserved_special_token_148|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128157": {
+ "content": "<|reserved_special_token_149|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128158": {
+ "content": "<|reserved_special_token_150|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128159": {
+ "content": "<|reserved_special_token_151|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128160": {
+ "content": "<|reserved_special_token_152|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128161": {
+ "content": "<|reserved_special_token_153|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128162": {
+ "content": "<|reserved_special_token_154|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128163": {
+ "content": "<|reserved_special_token_155|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128164": {
+ "content": "<|reserved_special_token_156|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128165": {
+ "content": "<|reserved_special_token_157|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128166": {
+ "content": "<|reserved_special_token_158|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128167": {
+ "content": "<|reserved_special_token_159|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128168": {
+ "content": "<|reserved_special_token_160|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128169": {
+ "content": "<|reserved_special_token_161|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128170": {
+ "content": "<|reserved_special_token_162|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128171": {
+ "content": "<|reserved_special_token_163|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128172": {
+ "content": "<|reserved_special_token_164|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128173": {
+ "content": "<|reserved_special_token_165|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128174": {
+ "content": "<|reserved_special_token_166|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128175": {
+ "content": "<|reserved_special_token_167|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128176": {
+ "content": "<|reserved_special_token_168|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128177": {
+ "content": "<|reserved_special_token_169|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128178": {
+ "content": "<|reserved_special_token_170|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128179": {
+ "content": "<|reserved_special_token_171|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128180": {
+ "content": "<|reserved_special_token_172|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128181": {
+ "content": "<|reserved_special_token_173|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128182": {
+ "content": "<|reserved_special_token_174|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128183": {
+ "content": "<|reserved_special_token_175|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128184": {
+ "content": "<|reserved_special_token_176|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128185": {
+ "content": "<|reserved_special_token_177|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128186": {
+ "content": "<|reserved_special_token_178|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128187": {
+ "content": "<|reserved_special_token_179|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128188": {
+ "content": "<|reserved_special_token_180|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128189": {
+ "content": "<|reserved_special_token_181|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128190": {
+ "content": "<|reserved_special_token_182|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128191": {
+ "content": "<|reserved_special_token_183|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128192": {
+ "content": "<|reserved_special_token_184|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128193": {
+ "content": "<|reserved_special_token_185|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128194": {
+ "content": "<|reserved_special_token_186|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128195": {
+ "content": "<|reserved_special_token_187|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128196": {
+ "content": "<|reserved_special_token_188|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128197": {
+ "content": "<|reserved_special_token_189|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128198": {
+ "content": "<|reserved_special_token_190|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128199": {
+ "content": "<|reserved_special_token_191|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128200": {
+ "content": "<|reserved_special_token_192|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128201": {
+ "content": "<|reserved_special_token_193|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128202": {
+ "content": "<|reserved_special_token_194|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128203": {
+ "content": "<|reserved_special_token_195|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128204": {
+ "content": "<|reserved_special_token_196|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128205": {
+ "content": "<|reserved_special_token_197|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128206": {
+ "content": "<|reserved_special_token_198|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128207": {
+ "content": "<|reserved_special_token_199|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128208": {
+ "content": "<|reserved_special_token_200|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128209": {
+ "content": "<|reserved_special_token_201|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128210": {
+ "content": "<|reserved_special_token_202|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128211": {
+ "content": "<|reserved_special_token_203|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128212": {
+ "content": "<|reserved_special_token_204|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128213": {
+ "content": "<|reserved_special_token_205|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128214": {
+ "content": "<|reserved_special_token_206|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128215": {
+ "content": "<|reserved_special_token_207|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128216": {
+ "content": "<|reserved_special_token_208|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128217": {
+ "content": "<|reserved_special_token_209|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128218": {
+ "content": "<|reserved_special_token_210|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128219": {
+ "content": "<|reserved_special_token_211|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128220": {
+ "content": "<|reserved_special_token_212|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128221": {
+ "content": "<|reserved_special_token_213|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128222": {
+ "content": "<|reserved_special_token_214|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128223": {
+ "content": "<|reserved_special_token_215|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128224": {
+ "content": "<|reserved_special_token_216|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128225": {
+ "content": "<|reserved_special_token_217|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128226": {
+ "content": "<|reserved_special_token_218|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128227": {
+ "content": "<|reserved_special_token_219|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128228": {
+ "content": "<|reserved_special_token_220|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128229": {
+ "content": "<|reserved_special_token_221|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128230": {
+ "content": "<|reserved_special_token_222|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128231": {
+ "content": "<|reserved_special_token_223|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128232": {
+ "content": "<|reserved_special_token_224|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128233": {
+ "content": "<|reserved_special_token_225|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128234": {
+ "content": "<|reserved_special_token_226|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128235": {
+ "content": "<|reserved_special_token_227|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128236": {
+ "content": "<|reserved_special_token_228|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128237": {
+ "content": "<|reserved_special_token_229|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128238": {
+ "content": "<|reserved_special_token_230|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128239": {
+ "content": "<|reserved_special_token_231|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128240": {
+ "content": "<|reserved_special_token_232|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128241": {
+ "content": "<|reserved_special_token_233|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128242": {
+ "content": "<|reserved_special_token_234|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128243": {
+ "content": "<|reserved_special_token_235|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128244": {
+ "content": "<|reserved_special_token_236|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128245": {
+ "content": "<|reserved_special_token_237|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128246": {
+ "content": "<|reserved_special_token_238|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128247": {
+ "content": "<|reserved_special_token_239|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128248": {
+ "content": "<|reserved_special_token_240|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128249": {
+ "content": "<|reserved_special_token_241|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128250": {
+ "content": "<|reserved_special_token_242|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128251": {
+ "content": "<|reserved_special_token_243|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128252": {
+ "content": "<|reserved_special_token_244|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128253": {
+ "content": "<|reserved_special_token_245|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128254": {
+ "content": "<|reserved_special_token_246|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128255": {
+ "content": "<|reserved_special_token_247|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<|begin_of_text|>",
+ "chat_template": "{{- bos_token }}\n{%- if custom_tools is defined %}\n {%- set tools = custom_tools %}\n{%- endif %}\n{%- if not tools_in_user_message is defined %}\n {%- set tools_in_user_message = true %}\n{%- endif %}\n{%- if not date_string is defined %}\n {%- set date_string = \"26 Jul 2024\" %}\n{%- endif %}\n{%- if not tools is defined %}\n {%- set tools = none %}\n{%- endif %}\n\n{#- This block extracts the system message, so we can slot it into the right place. #}\n{%- if messages[0]['role'] == 'system' %}\n {%- set system_message = messages[0]['content']|trim %}\n {%- set messages = messages[1:] %}\n{%- else %}\n {%- set system_message = \"\" %}\n{%- endif %}\n\n{#- System message + builtin tools #}\n{{- \"<|start_header_id|>system<|end_header_id|>\\n\\n\" }}\n{%- if builtin_tools is defined or tools is not none %}\n {{- \"Environment: ipython\\n\" }}\n{%- endif %}\n{%- if builtin_tools is defined %}\n {{- \"Tools: \" + builtin_tools | reject('equalto', 'code_interpreter') | join(\", \") + \"\\n\\n\"}}\n{%- endif %}\n{{- \"Cutting Knowledge Date: December 2023\\n\" }}\n{{- \"Today Date: \" + date_string + \"\\n\\n\" }}\n{%- if tools is not none and not tools_in_user_message %}\n {{- \"You have access to the following functions. To call a function, please respond with JSON for a function call.\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' }}\n {{- \"Do not use variables.\\n\\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\\n\\n\" }}\n {%- endfor %}\n{%- endif %}\n{{- system_message }}\n{{- \"<|eot_id|>\" }}\n\n{#- Custom tools are passed in a user message with some extra guidance #}\n{%- if tools_in_user_message and not tools is none %}\n {#- Extract the first user message so we can plug it in here #}\n {%- if messages | length != 0 %}\n {%- set first_user_message = messages[0]['content']|trim %}\n {%- set messages = messages[1:] %}\n {%- else %}\n {{- raise_exception(\"Cannot put tools in the first user message when there's no first user message!\") }}\n{%- endif %}\n {{- '<|start_header_id|>user<|end_header_id|>\\n\\n' -}}\n {{- \"Given the following functions, please respond with a JSON for a function call \" }}\n {{- \"with its proper arguments that best answers the given prompt.\\n\\n\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' }}\n {{- \"Do not use variables.\\n\\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\\n\\n\" }}\n {%- endfor %}\n {{- first_user_message + \"<|eot_id|>\"}}\n{%- endif %}\n\n{%- for message in messages %}\n {%- if not (message.role == 'ipython' or message.role == 'tool' or 'tool_calls' in message) %}\n {{- '<|start_header_id|>' + message['role'] + '<|end_header_id|>\\n\\n'+ message['content'] | trim + '<|eot_id|>' }}\n {%- elif 'tool_calls' in message %}\n {%- if not message.tool_calls|length == 1 %}\n {{- raise_exception(\"This model only supports single tool-calls at once!\") }}\n {%- endif %}\n {%- set tool_call = message.tool_calls[0].function %}\n {%- if builtin_tools is defined and tool_call.name in builtin_tools %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' -}}\n {{- \"<|python_tag|>\" + tool_call.name + \".call(\" }}\n {%- for arg_name, arg_val in tool_call.arguments | items %}\n {{- arg_name + '=\"' + arg_val + '\"' }}\n {%- if not loop.last %}\n {{- \", \" }}\n {%- endif %}\n {%- endfor %}\n {{- \")\" }}\n {%- else %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' -}}\n {{- '{\"name\": \"' + tool_call.name + '\", ' }}\n {{- '\"parameters\": ' }}\n {{- tool_call.arguments | tojson }}\n {{- \"}\" }}\n {%- endif %}\n {%- if builtin_tools is defined %}\n {#- This means we're in ipython mode #}\n {{- \"<|eom_id|>\" }}\n {%- else %}\n {{- \"<|eot_id|>\" }}\n {%- endif %}\n {%- elif message.role == \"tool\" or message.role == \"ipython\" %}\n {{- \"<|start_header_id|>ipython<|end_header_id|>\\n\\n\" }}\n {%- if message.content is mapping or message.content is iterable %}\n {{- message.content | tojson }}\n {%- else %}\n {{- message.content }}\n {%- endif %}\n {{- \"<|eot_id|>\" }}\n {%- endif %}\n{%- endfor %}\n{%- if add_generation_prompt %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' }}\n{%- endif %}\n",
+ "clean_up_tokenization_spaces": true,
+ "eos_token": "<|eot_id|>",
+ "extra_special_tokens": {},
+ "model_input_names": [
+ "input_ids",
+ "attention_mask"
+ ],
+ "model_max_length": 131072,
+ "pad_token": "<|end_of_text|>",
+ "tokenizer_class": "PreTrainedTokenizer"
+}
diff --git a/checkpoint-340/trainer_state.json b/checkpoint-340/trainer_state.json
new file mode 100644
index 0000000000000000000000000000000000000000..fb844fd4cd33711880f240ef8dd57ddce41dd5bc
--- /dev/null
+++ b/checkpoint-340/trainer_state.json
@@ -0,0 +1,2413 @@
+{
+ "best_metric": null,
+ "best_model_checkpoint": null,
+ "epoch": 3.9609375,
+ "eval_steps": 500,
+ "global_step": 340,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.01171875,
+ "grad_norm": 36.23282241821289,
+ "learning_rate": 5.0000000000000004e-08,
+ "loss": 2.3839,
+ "step": 1
+ },
+ {
+ "epoch": 0.0234375,
+ "grad_norm": 35.918636322021484,
+ "learning_rate": 1.0000000000000001e-07,
+ "loss": 2.3798,
+ "step": 2
+ },
+ {
+ "epoch": 0.03515625,
+ "grad_norm": 35.62618637084961,
+ "learning_rate": 1.5000000000000002e-07,
+ "loss": 2.386,
+ "step": 3
+ },
+ {
+ "epoch": 0.046875,
+ "grad_norm": 35.966087341308594,
+ "learning_rate": 2.0000000000000002e-07,
+ "loss": 2.3803,
+ "step": 4
+ },
+ {
+ "epoch": 0.05859375,
+ "grad_norm": 35.38177490234375,
+ "learning_rate": 2.5000000000000004e-07,
+ "loss": 2.3937,
+ "step": 5
+ },
+ {
+ "epoch": 0.0703125,
+ "grad_norm": 35.99677658081055,
+ "learning_rate": 3.0000000000000004e-07,
+ "loss": 2.3906,
+ "step": 6
+ },
+ {
+ "epoch": 0.08203125,
+ "grad_norm": 35.44341278076172,
+ "learning_rate": 3.5000000000000004e-07,
+ "loss": 2.3539,
+ "step": 7
+ },
+ {
+ "epoch": 0.09375,
+ "grad_norm": 35.300697326660156,
+ "learning_rate": 4.0000000000000003e-07,
+ "loss": 2.3459,
+ "step": 8
+ },
+ {
+ "epoch": 0.10546875,
+ "grad_norm": 34.092952728271484,
+ "learning_rate": 4.5000000000000003e-07,
+ "loss": 2.2959,
+ "step": 9
+ },
+ {
+ "epoch": 0.1171875,
+ "grad_norm": 34.46371841430664,
+ "learning_rate": 5.000000000000001e-07,
+ "loss": 2.2661,
+ "step": 10
+ },
+ {
+ "epoch": 0.12890625,
+ "grad_norm": 34.62260818481445,
+ "learning_rate": 5.5e-07,
+ "loss": 2.2918,
+ "step": 11
+ },
+ {
+ "epoch": 0.140625,
+ "grad_norm": 33.790374755859375,
+ "learning_rate": 6.000000000000001e-07,
+ "loss": 2.223,
+ "step": 12
+ },
+ {
+ "epoch": 0.15234375,
+ "grad_norm": 33.766536712646484,
+ "learning_rate": 6.5e-07,
+ "loss": 2.2267,
+ "step": 13
+ },
+ {
+ "epoch": 0.1640625,
+ "grad_norm": 33.894081115722656,
+ "learning_rate": 7.000000000000001e-07,
+ "loss": 2.1465,
+ "step": 14
+ },
+ {
+ "epoch": 0.17578125,
+ "grad_norm": 33.162452697753906,
+ "learning_rate": 7.5e-07,
+ "loss": 2.0495,
+ "step": 15
+ },
+ {
+ "epoch": 0.1875,
+ "grad_norm": 32.954341888427734,
+ "learning_rate": 8.000000000000001e-07,
+ "loss": 1.9627,
+ "step": 16
+ },
+ {
+ "epoch": 0.19921875,
+ "grad_norm": 33.96324157714844,
+ "learning_rate": 8.500000000000001e-07,
+ "loss": 1.8867,
+ "step": 17
+ },
+ {
+ "epoch": 0.2109375,
+ "grad_norm": 33.81139373779297,
+ "learning_rate": 9.000000000000001e-07,
+ "loss": 1.7752,
+ "step": 18
+ },
+ {
+ "epoch": 0.22265625,
+ "grad_norm": 34.87086868286133,
+ "learning_rate": 9.500000000000001e-07,
+ "loss": 1.6944,
+ "step": 19
+ },
+ {
+ "epoch": 0.234375,
+ "grad_norm": 34.84965133666992,
+ "learning_rate": 1.0000000000000002e-06,
+ "loss": 1.5707,
+ "step": 20
+ },
+ {
+ "epoch": 0.24609375,
+ "grad_norm": 35.227317810058594,
+ "learning_rate": 1.0500000000000001e-06,
+ "loss": 1.4369,
+ "step": 21
+ },
+ {
+ "epoch": 0.2578125,
+ "grad_norm": 34.91344451904297,
+ "learning_rate": 1.1e-06,
+ "loss": 1.3202,
+ "step": 22
+ },
+ {
+ "epoch": 0.26953125,
+ "grad_norm": 31.7376766204834,
+ "learning_rate": 1.1500000000000002e-06,
+ "loss": 1.1398,
+ "step": 23
+ },
+ {
+ "epoch": 0.28125,
+ "grad_norm": 30.24741554260254,
+ "learning_rate": 1.2000000000000002e-06,
+ "loss": 1.0421,
+ "step": 24
+ },
+ {
+ "epoch": 0.29296875,
+ "grad_norm": 28.292400360107422,
+ "learning_rate": 1.25e-06,
+ "loss": 0.8817,
+ "step": 25
+ },
+ {
+ "epoch": 0.3046875,
+ "grad_norm": 30.44672393798828,
+ "learning_rate": 1.3e-06,
+ "loss": 0.7073,
+ "step": 26
+ },
+ {
+ "epoch": 0.31640625,
+ "grad_norm": 29.416427612304688,
+ "learning_rate": 1.3500000000000002e-06,
+ "loss": 0.5444,
+ "step": 27
+ },
+ {
+ "epoch": 0.328125,
+ "grad_norm": 24.820096969604492,
+ "learning_rate": 1.4000000000000001e-06,
+ "loss": 0.4025,
+ "step": 28
+ },
+ {
+ "epoch": 0.33984375,
+ "grad_norm": 21.023277282714844,
+ "learning_rate": 1.45e-06,
+ "loss": 0.307,
+ "step": 29
+ },
+ {
+ "epoch": 0.3515625,
+ "grad_norm": 19.656967163085938,
+ "learning_rate": 1.5e-06,
+ "loss": 0.2151,
+ "step": 30
+ },
+ {
+ "epoch": 0.36328125,
+ "grad_norm": 14.91929817199707,
+ "learning_rate": 1.5500000000000002e-06,
+ "loss": 0.1448,
+ "step": 31
+ },
+ {
+ "epoch": 0.375,
+ "grad_norm": 5.083199977874756,
+ "learning_rate": 1.6000000000000001e-06,
+ "loss": 0.09,
+ "step": 32
+ },
+ {
+ "epoch": 0.38671875,
+ "grad_norm": 2.320681571960449,
+ "learning_rate": 1.6500000000000003e-06,
+ "loss": 0.0641,
+ "step": 33
+ },
+ {
+ "epoch": 0.3984375,
+ "grad_norm": 1.6233159303665161,
+ "learning_rate": 1.7000000000000002e-06,
+ "loss": 0.0584,
+ "step": 34
+ },
+ {
+ "epoch": 0.41015625,
+ "grad_norm": 1.6057201623916626,
+ "learning_rate": 1.75e-06,
+ "loss": 0.0626,
+ "step": 35
+ },
+ {
+ "epoch": 0.421875,
+ "grad_norm": 1.8360320329666138,
+ "learning_rate": 1.8000000000000001e-06,
+ "loss": 0.0563,
+ "step": 36
+ },
+ {
+ "epoch": 0.43359375,
+ "grad_norm": 1.736350178718567,
+ "learning_rate": 1.85e-06,
+ "loss": 0.0609,
+ "step": 37
+ },
+ {
+ "epoch": 0.4453125,
+ "grad_norm": 1.1473922729492188,
+ "learning_rate": 1.9000000000000002e-06,
+ "loss": 0.0541,
+ "step": 38
+ },
+ {
+ "epoch": 0.45703125,
+ "grad_norm": 1.1722168922424316,
+ "learning_rate": 1.9500000000000004e-06,
+ "loss": 0.0534,
+ "step": 39
+ },
+ {
+ "epoch": 0.46875,
+ "grad_norm": 1.356987714767456,
+ "learning_rate": 2.0000000000000003e-06,
+ "loss": 0.0496,
+ "step": 40
+ },
+ {
+ "epoch": 0.48046875,
+ "grad_norm": 0.8023216724395752,
+ "learning_rate": 2.05e-06,
+ "loss": 0.0527,
+ "step": 41
+ },
+ {
+ "epoch": 0.4921875,
+ "grad_norm": 0.9803515672683716,
+ "learning_rate": 2.1000000000000002e-06,
+ "loss": 0.0478,
+ "step": 42
+ },
+ {
+ "epoch": 0.50390625,
+ "grad_norm": 0.8733468651771545,
+ "learning_rate": 2.15e-06,
+ "loss": 0.052,
+ "step": 43
+ },
+ {
+ "epoch": 0.515625,
+ "grad_norm": 0.8213743567466736,
+ "learning_rate": 2.2e-06,
+ "loss": 0.0448,
+ "step": 44
+ },
+ {
+ "epoch": 0.52734375,
+ "grad_norm": 0.843189537525177,
+ "learning_rate": 2.25e-06,
+ "loss": 0.0498,
+ "step": 45
+ },
+ {
+ "epoch": 0.5390625,
+ "grad_norm": 0.8801079392433167,
+ "learning_rate": 2.3000000000000004e-06,
+ "loss": 0.0408,
+ "step": 46
+ },
+ {
+ "epoch": 0.55078125,
+ "grad_norm": 0.7131401300430298,
+ "learning_rate": 2.35e-06,
+ "loss": 0.0405,
+ "step": 47
+ },
+ {
+ "epoch": 0.5625,
+ "grad_norm": 0.8996126651763916,
+ "learning_rate": 2.4000000000000003e-06,
+ "loss": 0.0525,
+ "step": 48
+ },
+ {
+ "epoch": 0.57421875,
+ "grad_norm": 0.8606986403465271,
+ "learning_rate": 2.4500000000000003e-06,
+ "loss": 0.0438,
+ "step": 49
+ },
+ {
+ "epoch": 0.5859375,
+ "grad_norm": 0.6918051838874817,
+ "learning_rate": 2.5e-06,
+ "loss": 0.0394,
+ "step": 50
+ },
+ {
+ "epoch": 0.59765625,
+ "grad_norm": 0.6177802085876465,
+ "learning_rate": 2.55e-06,
+ "loss": 0.0387,
+ "step": 51
+ },
+ {
+ "epoch": 0.609375,
+ "grad_norm": 0.7042555809020996,
+ "learning_rate": 2.6e-06,
+ "loss": 0.0434,
+ "step": 52
+ },
+ {
+ "epoch": 0.62109375,
+ "grad_norm": 0.6537717580795288,
+ "learning_rate": 2.6500000000000005e-06,
+ "loss": 0.0396,
+ "step": 53
+ },
+ {
+ "epoch": 0.6328125,
+ "grad_norm": 0.7834082841873169,
+ "learning_rate": 2.7000000000000004e-06,
+ "loss": 0.0411,
+ "step": 54
+ },
+ {
+ "epoch": 0.64453125,
+ "grad_norm": 0.7287272810935974,
+ "learning_rate": 2.7500000000000004e-06,
+ "loss": 0.0408,
+ "step": 55
+ },
+ {
+ "epoch": 0.65625,
+ "grad_norm": 0.7186263203620911,
+ "learning_rate": 2.8000000000000003e-06,
+ "loss": 0.0394,
+ "step": 56
+ },
+ {
+ "epoch": 0.66796875,
+ "grad_norm": 0.7264899611473083,
+ "learning_rate": 2.85e-06,
+ "loss": 0.0427,
+ "step": 57
+ },
+ {
+ "epoch": 0.6796875,
+ "grad_norm": 0.7665618062019348,
+ "learning_rate": 2.9e-06,
+ "loss": 0.0368,
+ "step": 58
+ },
+ {
+ "epoch": 0.69140625,
+ "grad_norm": 0.7222962379455566,
+ "learning_rate": 2.95e-06,
+ "loss": 0.0412,
+ "step": 59
+ },
+ {
+ "epoch": 0.703125,
+ "grad_norm": 0.7061101794242859,
+ "learning_rate": 3e-06,
+ "loss": 0.0377,
+ "step": 60
+ },
+ {
+ "epoch": 0.71484375,
+ "grad_norm": 0.5724324584007263,
+ "learning_rate": 3.05e-06,
+ "loss": 0.0387,
+ "step": 61
+ },
+ {
+ "epoch": 0.7265625,
+ "grad_norm": 0.5535506010055542,
+ "learning_rate": 3.1000000000000004e-06,
+ "loss": 0.0403,
+ "step": 62
+ },
+ {
+ "epoch": 0.73828125,
+ "grad_norm": 0.6553678512573242,
+ "learning_rate": 3.1500000000000003e-06,
+ "loss": 0.0415,
+ "step": 63
+ },
+ {
+ "epoch": 0.75,
+ "grad_norm": 0.6137285828590393,
+ "learning_rate": 3.2000000000000003e-06,
+ "loss": 0.0383,
+ "step": 64
+ },
+ {
+ "epoch": 0.76171875,
+ "grad_norm": 0.5985754132270813,
+ "learning_rate": 3.2500000000000002e-06,
+ "loss": 0.0355,
+ "step": 65
+ },
+ {
+ "epoch": 0.7734375,
+ "grad_norm": 0.5903909802436829,
+ "learning_rate": 3.3000000000000006e-06,
+ "loss": 0.0374,
+ "step": 66
+ },
+ {
+ "epoch": 0.78515625,
+ "grad_norm": 0.5718765258789062,
+ "learning_rate": 3.3500000000000005e-06,
+ "loss": 0.0339,
+ "step": 67
+ },
+ {
+ "epoch": 0.796875,
+ "grad_norm": 0.6844965815544128,
+ "learning_rate": 3.4000000000000005e-06,
+ "loss": 0.0405,
+ "step": 68
+ },
+ {
+ "epoch": 0.80859375,
+ "grad_norm": 0.5959618091583252,
+ "learning_rate": 3.45e-06,
+ "loss": 0.0338,
+ "step": 69
+ },
+ {
+ "epoch": 0.8203125,
+ "grad_norm": 0.6095123291015625,
+ "learning_rate": 3.5e-06,
+ "loss": 0.0362,
+ "step": 70
+ },
+ {
+ "epoch": 0.83203125,
+ "grad_norm": 0.543708086013794,
+ "learning_rate": 3.5500000000000003e-06,
+ "loss": 0.0355,
+ "step": 71
+ },
+ {
+ "epoch": 0.84375,
+ "grad_norm": 0.6969983577728271,
+ "learning_rate": 3.6000000000000003e-06,
+ "loss": 0.0325,
+ "step": 72
+ },
+ {
+ "epoch": 0.85546875,
+ "grad_norm": 0.6022969484329224,
+ "learning_rate": 3.65e-06,
+ "loss": 0.0342,
+ "step": 73
+ },
+ {
+ "epoch": 0.8671875,
+ "grad_norm": 0.6262147426605225,
+ "learning_rate": 3.7e-06,
+ "loss": 0.0348,
+ "step": 74
+ },
+ {
+ "epoch": 0.87890625,
+ "grad_norm": 0.5729933381080627,
+ "learning_rate": 3.7500000000000005e-06,
+ "loss": 0.0318,
+ "step": 75
+ },
+ {
+ "epoch": 0.890625,
+ "grad_norm": 0.5846775770187378,
+ "learning_rate": 3.8000000000000005e-06,
+ "loss": 0.0309,
+ "step": 76
+ },
+ {
+ "epoch": 0.90234375,
+ "grad_norm": 0.6469219923019409,
+ "learning_rate": 3.85e-06,
+ "loss": 0.0324,
+ "step": 77
+ },
+ {
+ "epoch": 0.9140625,
+ "grad_norm": 0.6574859023094177,
+ "learning_rate": 3.900000000000001e-06,
+ "loss": 0.0325,
+ "step": 78
+ },
+ {
+ "epoch": 0.92578125,
+ "grad_norm": 0.5833832025527954,
+ "learning_rate": 3.95e-06,
+ "loss": 0.0232,
+ "step": 79
+ },
+ {
+ "epoch": 0.9375,
+ "grad_norm": 0.7503570318222046,
+ "learning_rate": 4.000000000000001e-06,
+ "loss": 0.0267,
+ "step": 80
+ },
+ {
+ "epoch": 0.94921875,
+ "grad_norm": 0.7181633114814758,
+ "learning_rate": 4.05e-06,
+ "loss": 0.0304,
+ "step": 81
+ },
+ {
+ "epoch": 0.9609375,
+ "grad_norm": 0.6477274298667908,
+ "learning_rate": 4.1e-06,
+ "loss": 0.0297,
+ "step": 82
+ },
+ {
+ "epoch": 0.97265625,
+ "grad_norm": 0.6768563389778137,
+ "learning_rate": 4.15e-06,
+ "loss": 0.0279,
+ "step": 83
+ },
+ {
+ "epoch": 0.984375,
+ "grad_norm": 0.7905837297439575,
+ "learning_rate": 4.2000000000000004e-06,
+ "loss": 0.0301,
+ "step": 84
+ },
+ {
+ "epoch": 0.99609375,
+ "grad_norm": 0.5576608777046204,
+ "learning_rate": 4.25e-06,
+ "loss": 0.0322,
+ "step": 85
+ },
+ {
+ "epoch": 1.0,
+ "grad_norm": 0.5576608777046204,
+ "learning_rate": 4.3e-06,
+ "loss": 0.0226,
+ "step": 86
+ },
+ {
+ "epoch": 1.01171875,
+ "grad_norm": 1.0774812698364258,
+ "learning_rate": 4.350000000000001e-06,
+ "loss": 0.0215,
+ "step": 87
+ },
+ {
+ "epoch": 1.0234375,
+ "grad_norm": 0.47373324632644653,
+ "learning_rate": 4.4e-06,
+ "loss": 0.0235,
+ "step": 88
+ },
+ {
+ "epoch": 1.03515625,
+ "grad_norm": 0.7665970325469971,
+ "learning_rate": 4.450000000000001e-06,
+ "loss": 0.0242,
+ "step": 89
+ },
+ {
+ "epoch": 1.046875,
+ "grad_norm": 0.6290147304534912,
+ "learning_rate": 4.5e-06,
+ "loss": 0.0209,
+ "step": 90
+ },
+ {
+ "epoch": 1.05859375,
+ "grad_norm": 0.5703024864196777,
+ "learning_rate": 4.5500000000000005e-06,
+ "loss": 0.0192,
+ "step": 91
+ },
+ {
+ "epoch": 1.0703125,
+ "grad_norm": 0.6099259853363037,
+ "learning_rate": 4.600000000000001e-06,
+ "loss": 0.0181,
+ "step": 92
+ },
+ {
+ "epoch": 1.08203125,
+ "grad_norm": 0.6570988297462463,
+ "learning_rate": 4.65e-06,
+ "loss": 0.0201,
+ "step": 93
+ },
+ {
+ "epoch": 1.09375,
+ "grad_norm": 0.7848325371742249,
+ "learning_rate": 4.7e-06,
+ "loss": 0.0253,
+ "step": 94
+ },
+ {
+ "epoch": 1.10546875,
+ "grad_norm": 0.6759209036827087,
+ "learning_rate": 4.75e-06,
+ "loss": 0.0195,
+ "step": 95
+ },
+ {
+ "epoch": 1.1171875,
+ "grad_norm": 0.4861151874065399,
+ "learning_rate": 4.800000000000001e-06,
+ "loss": 0.0191,
+ "step": 96
+ },
+ {
+ "epoch": 1.12890625,
+ "grad_norm": 0.6268576383590698,
+ "learning_rate": 4.85e-06,
+ "loss": 0.0211,
+ "step": 97
+ },
+ {
+ "epoch": 1.140625,
+ "grad_norm": 0.5862017869949341,
+ "learning_rate": 4.9000000000000005e-06,
+ "loss": 0.0177,
+ "step": 98
+ },
+ {
+ "epoch": 1.15234375,
+ "grad_norm": 0.4569724202156067,
+ "learning_rate": 4.95e-06,
+ "loss": 0.0164,
+ "step": 99
+ },
+ {
+ "epoch": 1.1640625,
+ "grad_norm": 0.4539048969745636,
+ "learning_rate": 5e-06,
+ "loss": 0.0152,
+ "step": 100
+ },
+ {
+ "epoch": 1.17578125,
+ "grad_norm": 0.4553528428077698,
+ "learning_rate": 4.999926609487568e-06,
+ "loss": 0.0208,
+ "step": 101
+ },
+ {
+ "epoch": 1.1875,
+ "grad_norm": 0.5182592272758484,
+ "learning_rate": 4.999706442259205e-06,
+ "loss": 0.0154,
+ "step": 102
+ },
+ {
+ "epoch": 1.19921875,
+ "grad_norm": 0.5602673888206482,
+ "learning_rate": 4.999339511241458e-06,
+ "loss": 0.0196,
+ "step": 103
+ },
+ {
+ "epoch": 1.2109375,
+ "grad_norm": 0.7579494118690491,
+ "learning_rate": 4.9988258379777334e-06,
+ "loss": 0.0198,
+ "step": 104
+ },
+ {
+ "epoch": 1.22265625,
+ "grad_norm": 0.603757381439209,
+ "learning_rate": 4.998165452627025e-06,
+ "loss": 0.0185,
+ "step": 105
+ },
+ {
+ "epoch": 1.234375,
+ "grad_norm": 0.5520291924476624,
+ "learning_rate": 4.99735839396215e-06,
+ "loss": 0.018,
+ "step": 106
+ },
+ {
+ "epoch": 1.24609375,
+ "grad_norm": 0.55808424949646,
+ "learning_rate": 4.996404709367466e-06,
+ "loss": 0.0159,
+ "step": 107
+ },
+ {
+ "epoch": 1.2578125,
+ "grad_norm": 0.47174298763275146,
+ "learning_rate": 4.995304454836095e-06,
+ "loss": 0.0122,
+ "step": 108
+ },
+ {
+ "epoch": 1.26953125,
+ "grad_norm": 0.5289337038993835,
+ "learning_rate": 4.994057694966632e-06,
+ "loss": 0.0168,
+ "step": 109
+ },
+ {
+ "epoch": 1.28125,
+ "grad_norm": 0.5390430092811584,
+ "learning_rate": 4.992664502959351e-06,
+ "loss": 0.017,
+ "step": 110
+ },
+ {
+ "epoch": 1.29296875,
+ "grad_norm": 0.4966451823711395,
+ "learning_rate": 4.991124960611916e-06,
+ "loss": 0.0145,
+ "step": 111
+ },
+ {
+ "epoch": 1.3046875,
+ "grad_norm": 0.6148604154586792,
+ "learning_rate": 4.989439158314566e-06,
+ "loss": 0.0139,
+ "step": 112
+ },
+ {
+ "epoch": 1.31640625,
+ "grad_norm": 0.6303534507751465,
+ "learning_rate": 4.9876071950448185e-06,
+ "loss": 0.0118,
+ "step": 113
+ },
+ {
+ "epoch": 1.328125,
+ "grad_norm": 0.5410207509994507,
+ "learning_rate": 4.98562917836165e-06,
+ "loss": 0.0094,
+ "step": 114
+ },
+ {
+ "epoch": 1.33984375,
+ "grad_norm": 0.5350080132484436,
+ "learning_rate": 4.983505224399188e-06,
+ "loss": 0.0158,
+ "step": 115
+ },
+ {
+ "epoch": 1.3515625,
+ "grad_norm": 1.017317295074463,
+ "learning_rate": 4.9812354578598876e-06,
+ "loss": 0.0201,
+ "step": 116
+ },
+ {
+ "epoch": 1.36328125,
+ "grad_norm": 0.6891007423400879,
+ "learning_rate": 4.978820012007213e-06,
+ "loss": 0.0127,
+ "step": 117
+ },
+ {
+ "epoch": 1.375,
+ "grad_norm": 0.4756389260292053,
+ "learning_rate": 4.976259028657812e-06,
+ "loss": 0.0188,
+ "step": 118
+ },
+ {
+ "epoch": 1.38671875,
+ "grad_norm": 0.5957350730895996,
+ "learning_rate": 4.973552658173186e-06,
+ "loss": 0.011,
+ "step": 119
+ },
+ {
+ "epoch": 1.3984375,
+ "grad_norm": 0.5012223720550537,
+ "learning_rate": 4.970701059450872e-06,
+ "loss": 0.0138,
+ "step": 120
+ },
+ {
+ "epoch": 1.41015625,
+ "grad_norm": 0.4408419132232666,
+ "learning_rate": 4.9677043999151e-06,
+ "loss": 0.0144,
+ "step": 121
+ },
+ {
+ "epoch": 1.421875,
+ "grad_norm": 0.5721736550331116,
+ "learning_rate": 4.964562855506976e-06,
+ "loss": 0.0135,
+ "step": 122
+ },
+ {
+ "epoch": 1.43359375,
+ "grad_norm": 0.5479208827018738,
+ "learning_rate": 4.961276610674141e-06,
+ "loss": 0.0128,
+ "step": 123
+ },
+ {
+ "epoch": 1.4453125,
+ "grad_norm": 1.0117675065994263,
+ "learning_rate": 4.9578458583599495e-06,
+ "loss": 0.0111,
+ "step": 124
+ },
+ {
+ "epoch": 1.45703125,
+ "grad_norm": 0.5504026412963867,
+ "learning_rate": 4.954270799992138e-06,
+ "loss": 0.0083,
+ "step": 125
+ },
+ {
+ "epoch": 1.46875,
+ "grad_norm": 0.48403099179267883,
+ "learning_rate": 4.950551645470998e-06,
+ "loss": 0.0083,
+ "step": 126
+ },
+ {
+ "epoch": 1.48046875,
+ "grad_norm": 0.6866800785064697,
+ "learning_rate": 4.9466886131570565e-06,
+ "loss": 0.0085,
+ "step": 127
+ },
+ {
+ "epoch": 1.4921875,
+ "grad_norm": 0.872557520866394,
+ "learning_rate": 4.942681929858249e-06,
+ "loss": 0.0102,
+ "step": 128
+ },
+ {
+ "epoch": 1.50390625,
+ "grad_norm": 0.6924716234207153,
+ "learning_rate": 4.9385318308166065e-06,
+ "loss": 0.012,
+ "step": 129
+ },
+ {
+ "epoch": 1.515625,
+ "grad_norm": 0.5060118436813354,
+ "learning_rate": 4.934238559694448e-06,
+ "loss": 0.0084,
+ "step": 130
+ },
+ {
+ "epoch": 1.52734375,
+ "grad_norm": 0.6256171464920044,
+ "learning_rate": 4.929802368560066e-06,
+ "loss": 0.0081,
+ "step": 131
+ },
+ {
+ "epoch": 1.5390625,
+ "grad_norm": 0.5422537922859192,
+ "learning_rate": 4.925223517872934e-06,
+ "loss": 0.0077,
+ "step": 132
+ },
+ {
+ "epoch": 1.55078125,
+ "grad_norm": 0.953416109085083,
+ "learning_rate": 4.920502276468408e-06,
+ "loss": 0.0078,
+ "step": 133
+ },
+ {
+ "epoch": 1.5625,
+ "grad_norm": 0.4540804624557495,
+ "learning_rate": 4.915638921541952e-06,
+ "loss": 0.0097,
+ "step": 134
+ },
+ {
+ "epoch": 1.57421875,
+ "grad_norm": 0.3773641884326935,
+ "learning_rate": 4.9106337386328524e-06,
+ "loss": 0.0098,
+ "step": 135
+ },
+ {
+ "epoch": 1.5859375,
+ "grad_norm": 0.7970175743103027,
+ "learning_rate": 4.905487021607462e-06,
+ "loss": 0.0056,
+ "step": 136
+ },
+ {
+ "epoch": 1.59765625,
+ "grad_norm": 0.45197635889053345,
+ "learning_rate": 4.900199072641937e-06,
+ "loss": 0.0078,
+ "step": 137
+ },
+ {
+ "epoch": 1.609375,
+ "grad_norm": 0.38231438398361206,
+ "learning_rate": 4.894770202204509e-06,
+ "loss": 0.0072,
+ "step": 138
+ },
+ {
+ "epoch": 1.62109375,
+ "grad_norm": 0.2945426404476166,
+ "learning_rate": 4.889200729037241e-06,
+ "loss": 0.0086,
+ "step": 139
+ },
+ {
+ "epoch": 1.6328125,
+ "grad_norm": 0.49699363112449646,
+ "learning_rate": 4.883490980137327e-06,
+ "loss": 0.0073,
+ "step": 140
+ },
+ {
+ "epoch": 1.64453125,
+ "grad_norm": 0.38112956285476685,
+ "learning_rate": 4.8776412907378845e-06,
+ "loss": 0.0056,
+ "step": 141
+ },
+ {
+ "epoch": 1.65625,
+ "grad_norm": 0.46780407428741455,
+ "learning_rate": 4.871652004288275e-06,
+ "loss": 0.0078,
+ "step": 142
+ },
+ {
+ "epoch": 1.66796875,
+ "grad_norm": 0.43764325976371765,
+ "learning_rate": 4.865523472433942e-06,
+ "loss": 0.005,
+ "step": 143
+ },
+ {
+ "epoch": 1.6796875,
+ "grad_norm": 0.3445664644241333,
+ "learning_rate": 4.859256054995758e-06,
+ "loss": 0.0069,
+ "step": 144
+ },
+ {
+ "epoch": 1.69140625,
+ "grad_norm": 0.40410447120666504,
+ "learning_rate": 4.8528501199489045e-06,
+ "loss": 0.0088,
+ "step": 145
+ },
+ {
+ "epoch": 1.703125,
+ "grad_norm": 0.5876736640930176,
+ "learning_rate": 4.846306043401268e-06,
+ "loss": 0.0057,
+ "step": 146
+ },
+ {
+ "epoch": 1.71484375,
+ "grad_norm": 0.5149250626564026,
+ "learning_rate": 4.839624209571352e-06,
+ "loss": 0.0056,
+ "step": 147
+ },
+ {
+ "epoch": 1.7265625,
+ "grad_norm": 0.7009180784225464,
+ "learning_rate": 4.832805010765724e-06,
+ "loss": 0.0088,
+ "step": 148
+ },
+ {
+ "epoch": 1.73828125,
+ "grad_norm": 0.42258334159851074,
+ "learning_rate": 4.8258488473559794e-06,
+ "loss": 0.004,
+ "step": 149
+ },
+ {
+ "epoch": 1.75,
+ "grad_norm": 0.39231887459754944,
+ "learning_rate": 4.8187561277552376e-06,
+ "loss": 0.005,
+ "step": 150
+ },
+ {
+ "epoch": 1.76171875,
+ "grad_norm": 0.3317432701587677,
+ "learning_rate": 4.811527268394157e-06,
+ "loss": 0.0038,
+ "step": 151
+ },
+ {
+ "epoch": 1.7734375,
+ "grad_norm": 0.5022267699241638,
+ "learning_rate": 4.804162693696494e-06,
+ "loss": 0.0056,
+ "step": 152
+ },
+ {
+ "epoch": 1.78515625,
+ "grad_norm": 0.39019322395324707,
+ "learning_rate": 4.796662836054176e-06,
+ "loss": 0.0053,
+ "step": 153
+ },
+ {
+ "epoch": 1.796875,
+ "grad_norm": 0.5674042701721191,
+ "learning_rate": 4.789028135801919e-06,
+ "loss": 0.007,
+ "step": 154
+ },
+ {
+ "epoch": 1.80859375,
+ "grad_norm": 0.5690024495124817,
+ "learning_rate": 4.7812590411913755e-06,
+ "loss": 0.0053,
+ "step": 155
+ },
+ {
+ "epoch": 1.8203125,
+ "grad_norm": 0.23775412142276764,
+ "learning_rate": 4.773356008364812e-06,
+ "loss": 0.0031,
+ "step": 156
+ },
+ {
+ "epoch": 1.83203125,
+ "grad_norm": 0.4698558747768402,
+ "learning_rate": 4.765319501328332e-06,
+ "loss": 0.0021,
+ "step": 157
+ },
+ {
+ "epoch": 1.84375,
+ "grad_norm": 0.21603639423847198,
+ "learning_rate": 4.757149991924633e-06,
+ "loss": 0.0046,
+ "step": 158
+ },
+ {
+ "epoch": 1.85546875,
+ "grad_norm": 0.33830726146698,
+ "learning_rate": 4.748847959805297e-06,
+ "loss": 0.0022,
+ "step": 159
+ },
+ {
+ "epoch": 1.8671875,
+ "grad_norm": 0.44919782876968384,
+ "learning_rate": 4.740413892402639e-06,
+ "loss": 0.0032,
+ "step": 160
+ },
+ {
+ "epoch": 1.87890625,
+ "grad_norm": 0.5119614601135254,
+ "learning_rate": 4.731848284901082e-06,
+ "loss": 0.006,
+ "step": 161
+ },
+ {
+ "epoch": 1.890625,
+ "grad_norm": 0.3875437080860138,
+ "learning_rate": 4.723151640208084e-06,
+ "loss": 0.0024,
+ "step": 162
+ },
+ {
+ "epoch": 1.90234375,
+ "grad_norm": 0.3179910182952881,
+ "learning_rate": 4.714324468924614e-06,
+ "loss": 0.0037,
+ "step": 163
+ },
+ {
+ "epoch": 1.9140625,
+ "grad_norm": 0.43395644426345825,
+ "learning_rate": 4.705367289315172e-06,
+ "loss": 0.0027,
+ "step": 164
+ },
+ {
+ "epoch": 1.92578125,
+ "grad_norm": 0.3703945577144623,
+ "learning_rate": 4.696280627277356e-06,
+ "loss": 0.0047,
+ "step": 165
+ },
+ {
+ "epoch": 1.9375,
+ "grad_norm": 0.2503529191017151,
+ "learning_rate": 4.687065016310996e-06,
+ "loss": 0.0052,
+ "step": 166
+ },
+ {
+ "epoch": 1.94921875,
+ "grad_norm": 0.3613075315952301,
+ "learning_rate": 4.6777209974868194e-06,
+ "loss": 0.0034,
+ "step": 167
+ },
+ {
+ "epoch": 1.9609375,
+ "grad_norm": 0.3578515350818634,
+ "learning_rate": 4.668249119414692e-06,
+ "loss": 0.0021,
+ "step": 168
+ },
+ {
+ "epoch": 1.97265625,
+ "grad_norm": 0.1784515529870987,
+ "learning_rate": 4.6586499382113985e-06,
+ "loss": 0.0018,
+ "step": 169
+ },
+ {
+ "epoch": 1.984375,
+ "grad_norm": 0.259198397397995,
+ "learning_rate": 4.648924017468003e-06,
+ "loss": 0.0009,
+ "step": 170
+ },
+ {
+ "epoch": 1.99609375,
+ "grad_norm": 0.7194133400917053,
+ "learning_rate": 4.6390719282167515e-06,
+ "loss": 0.0041,
+ "step": 171
+ },
+ {
+ "epoch": 2.0,
+ "grad_norm": 0.7194133400917053,
+ "learning_rate": 4.629094248897546e-06,
+ "loss": 0.0014,
+ "step": 172
+ },
+ {
+ "epoch": 2.01171875,
+ "grad_norm": 0.5032601952552795,
+ "learning_rate": 4.618991565323987e-06,
+ "loss": 0.0028,
+ "step": 173
+ },
+ {
+ "epoch": 2.0234375,
+ "grad_norm": 0.6387512683868408,
+ "learning_rate": 4.608764470648971e-06,
+ "loss": 0.0007,
+ "step": 174
+ },
+ {
+ "epoch": 2.03515625,
+ "grad_norm": 0.23177844285964966,
+ "learning_rate": 4.598413565329876e-06,
+ "loss": 0.0006,
+ "step": 175
+ },
+ {
+ "epoch": 2.046875,
+ "grad_norm": 0.1713147759437561,
+ "learning_rate": 4.587939457093296e-06,
+ "loss": 0.0003,
+ "step": 176
+ },
+ {
+ "epoch": 2.05859375,
+ "grad_norm": 0.06128697097301483,
+ "learning_rate": 4.577342760899368e-06,
+ "loss": 0.0001,
+ "step": 177
+ },
+ {
+ "epoch": 2.0703125,
+ "grad_norm": 0.538530170917511,
+ "learning_rate": 4.566624098905665e-06,
+ "loss": 0.0004,
+ "step": 178
+ },
+ {
+ "epoch": 2.08203125,
+ "grad_norm": 0.03301696106791496,
+ "learning_rate": 4.555784100430662e-06,
+ "loss": 0.0004,
+ "step": 179
+ },
+ {
+ "epoch": 2.09375,
+ "grad_norm": 0.21366432309150696,
+ "learning_rate": 4.544823401916794e-06,
+ "loss": 0.0014,
+ "step": 180
+ },
+ {
+ "epoch": 2.10546875,
+ "grad_norm": 0.13440090417861938,
+ "learning_rate": 4.533742646893086e-06,
+ "loss": 0.0004,
+ "step": 181
+ },
+ {
+ "epoch": 2.1171875,
+ "grad_norm": 0.531997799873352,
+ "learning_rate": 4.522542485937369e-06,
+ "loss": 0.0008,
+ "step": 182
+ },
+ {
+ "epoch": 2.12890625,
+ "grad_norm": 0.2832719385623932,
+ "learning_rate": 4.511223576638084e-06,
+ "loss": 0.0023,
+ "step": 183
+ },
+ {
+ "epoch": 2.140625,
+ "grad_norm": 0.3814002275466919,
+ "learning_rate": 4.499786583555675e-06,
+ "loss": 0.001,
+ "step": 184
+ },
+ {
+ "epoch": 2.15234375,
+ "grad_norm": 0.2522885501384735,
+ "learning_rate": 4.4882321781835666e-06,
+ "loss": 0.0004,
+ "step": 185
+ },
+ {
+ "epoch": 2.1640625,
+ "grad_norm": 0.3866797983646393,
+ "learning_rate": 4.476561038908745e-06,
+ "loss": 0.0007,
+ "step": 186
+ },
+ {
+ "epoch": 2.17578125,
+ "grad_norm": 0.2128417044878006,
+ "learning_rate": 4.464773850971924e-06,
+ "loss": 0.0001,
+ "step": 187
+ },
+ {
+ "epoch": 2.1875,
+ "grad_norm": 0.135880708694458,
+ "learning_rate": 4.452871306427314e-06,
+ "loss": 0.0031,
+ "step": 188
+ },
+ {
+ "epoch": 2.19921875,
+ "grad_norm": 0.38835451006889343,
+ "learning_rate": 4.440854104101988e-06,
+ "loss": 0.0015,
+ "step": 189
+ },
+ {
+ "epoch": 2.2109375,
+ "grad_norm": 0.18233123421669006,
+ "learning_rate": 4.428722949554858e-06,
+ "loss": 0.0001,
+ "step": 190
+ },
+ {
+ "epoch": 2.22265625,
+ "grad_norm": 0.10753051191568375,
+ "learning_rate": 4.416478555035241e-06,
+ "loss": 0.0017,
+ "step": 191
+ },
+ {
+ "epoch": 2.234375,
+ "grad_norm": 0.30138343572616577,
+ "learning_rate": 4.404121639441047e-06,
+ "loss": 0.0004,
+ "step": 192
+ },
+ {
+ "epoch": 2.24609375,
+ "grad_norm": 0.12771356105804443,
+ "learning_rate": 4.391652928276572e-06,
+ "loss": 0.0022,
+ "step": 193
+ },
+ {
+ "epoch": 2.2578125,
+ "grad_norm": 0.4173564612865448,
+ "learning_rate": 4.379073153609896e-06,
+ "loss": 0.0001,
+ "step": 194
+ },
+ {
+ "epoch": 2.26953125,
+ "grad_norm": 0.08329658955335617,
+ "learning_rate": 4.366383054029907e-06,
+ "loss": 0.0009,
+ "step": 195
+ },
+ {
+ "epoch": 2.28125,
+ "grad_norm": 0.21187439560890198,
+ "learning_rate": 4.3535833746029335e-06,
+ "loss": 0.0013,
+ "step": 196
+ },
+ {
+ "epoch": 2.29296875,
+ "grad_norm": 0.046030864119529724,
+ "learning_rate": 4.340674866829001e-06,
+ "loss": 0.0004,
+ "step": 197
+ },
+ {
+ "epoch": 2.3046875,
+ "grad_norm": 0.08373020589351654,
+ "learning_rate": 4.32765828859771e-06,
+ "loss": 0.0014,
+ "step": 198
+ },
+ {
+ "epoch": 2.31640625,
+ "grad_norm": 0.4026390314102173,
+ "learning_rate": 4.314534404143738e-06,
+ "loss": 0.0003,
+ "step": 199
+ },
+ {
+ "epoch": 2.328125,
+ "grad_norm": 0.24255593121051788,
+ "learning_rate": 4.3013039840019675e-06,
+ "loss": 0.0009,
+ "step": 200
+ },
+ {
+ "epoch": 2.33984375,
+ "grad_norm": 0.2282780110836029,
+ "learning_rate": 4.287967804962252e-06,
+ "loss": 0.0025,
+ "step": 201
+ },
+ {
+ "epoch": 2.3515625,
+ "grad_norm": 0.14743350446224213,
+ "learning_rate": 4.274526650023801e-06,
+ "loss": 0.0014,
+ "step": 202
+ },
+ {
+ "epoch": 2.36328125,
+ "grad_norm": 0.17971713840961456,
+ "learning_rate": 4.260981308349214e-06,
+ "loss": 0.0003,
+ "step": 203
+ },
+ {
+ "epoch": 2.375,
+ "grad_norm": 0.03872796148061752,
+ "learning_rate": 4.247332575218144e-06,
+ "loss": 0.0003,
+ "step": 204
+ },
+ {
+ "epoch": 2.38671875,
+ "grad_norm": 0.06636863946914673,
+ "learning_rate": 4.233581251980604e-06,
+ "loss": 0.0004,
+ "step": 205
+ },
+ {
+ "epoch": 2.3984375,
+ "grad_norm": 0.1254304051399231,
+ "learning_rate": 4.2197281460099245e-06,
+ "loss": 0.0002,
+ "step": 206
+ },
+ {
+ "epoch": 2.41015625,
+ "grad_norm": 0.03998701646924019,
+ "learning_rate": 4.2057740706553415e-06,
+ "loss": 0.0007,
+ "step": 207
+ },
+ {
+ "epoch": 2.421875,
+ "grad_norm": 0.8734745979309082,
+ "learning_rate": 4.191719845194246e-06,
+ "loss": 0.0019,
+ "step": 208
+ },
+ {
+ "epoch": 2.43359375,
+ "grad_norm": 0.34975236654281616,
+ "learning_rate": 4.177566294784085e-06,
+ "loss": 0.0006,
+ "step": 209
+ },
+ {
+ "epoch": 2.4453125,
+ "grad_norm": 0.07566183060407639,
+ "learning_rate": 4.163314250413913e-06,
+ "loss": 0.0003,
+ "step": 210
+ },
+ {
+ "epoch": 2.45703125,
+ "grad_norm": 0.09056711941957474,
+ "learning_rate": 4.148964548855603e-06,
+ "loss": 0.0002,
+ "step": 211
+ },
+ {
+ "epoch": 2.46875,
+ "grad_norm": 0.16160684823989868,
+ "learning_rate": 4.134518032614713e-06,
+ "loss": 0.0009,
+ "step": 212
+ },
+ {
+ "epoch": 2.48046875,
+ "grad_norm": 0.0812753438949585,
+ "learning_rate": 4.119975549881029e-06,
+ "loss": 0.0002,
+ "step": 213
+ },
+ {
+ "epoch": 2.4921875,
+ "grad_norm": 0.05827738344669342,
+ "learning_rate": 4.105337954478756e-06,
+ "loss": 0.0007,
+ "step": 214
+ },
+ {
+ "epoch": 2.50390625,
+ "grad_norm": 0.2625848054885864,
+ "learning_rate": 4.0906061058164e-06,
+ "loss": 0.0003,
+ "step": 215
+ },
+ {
+ "epoch": 2.515625,
+ "grad_norm": 0.1771923154592514,
+ "learning_rate": 4.075780868836296e-06,
+ "loss": 0.0005,
+ "step": 216
+ },
+ {
+ "epoch": 2.52734375,
+ "grad_norm": 0.034166041761636734,
+ "learning_rate": 4.060863113963835e-06,
+ "loss": 0.0012,
+ "step": 217
+ },
+ {
+ "epoch": 2.5390625,
+ "grad_norm": 0.14099521934986115,
+ "learning_rate": 4.045853717056358e-06,
+ "loss": 0.0,
+ "step": 218
+ },
+ {
+ "epoch": 2.55078125,
+ "grad_norm": 0.34704917669296265,
+ "learning_rate": 4.030753559351728e-06,
+ "loss": 0.0006,
+ "step": 219
+ },
+ {
+ "epoch": 2.5625,
+ "grad_norm": 0.25681111216545105,
+ "learning_rate": 4.015563527416596e-06,
+ "loss": 0.0004,
+ "step": 220
+ },
+ {
+ "epoch": 2.57421875,
+ "grad_norm": 0.36212408542633057,
+ "learning_rate": 4.000284513094342e-06,
+ "loss": 0.0003,
+ "step": 221
+ },
+ {
+ "epoch": 2.5859375,
+ "grad_norm": 0.13945375382900238,
+ "learning_rate": 3.984917413452721e-06,
+ "loss": 0.0001,
+ "step": 222
+ },
+ {
+ "epoch": 2.59765625,
+ "grad_norm": 0.06798060238361359,
+ "learning_rate": 3.969463130731183e-06,
+ "loss": 0.0007,
+ "step": 223
+ },
+ {
+ "epoch": 2.609375,
+ "grad_norm": 0.19848179817199707,
+ "learning_rate": 3.953922572287915e-06,
+ "loss": 0.0007,
+ "step": 224
+ },
+ {
+ "epoch": 2.62109375,
+ "grad_norm": 0.5454645156860352,
+ "learning_rate": 3.938296650546552e-06,
+ "loss": 0.0018,
+ "step": 225
+ },
+ {
+ "epoch": 2.6328125,
+ "grad_norm": 0.22043731808662415,
+ "learning_rate": 3.9225862829426184e-06,
+ "loss": 0.0036,
+ "step": 226
+ },
+ {
+ "epoch": 2.64453125,
+ "grad_norm": 0.3086087107658386,
+ "learning_rate": 3.906792391869657e-06,
+ "loss": 0.0002,
+ "step": 227
+ },
+ {
+ "epoch": 2.65625,
+ "grad_norm": 0.04387599974870682,
+ "learning_rate": 3.890915904625075e-06,
+ "loss": 0.0014,
+ "step": 228
+ },
+ {
+ "epoch": 2.66796875,
+ "grad_norm": 0.3786030113697052,
+ "learning_rate": 3.874957753355701e-06,
+ "loss": 0.0014,
+ "step": 229
+ },
+ {
+ "epoch": 2.6796875,
+ "grad_norm": 0.28310713171958923,
+ "learning_rate": 3.858918875003053e-06,
+ "loss": 0.0001,
+ "step": 230
+ },
+ {
+ "epoch": 2.69140625,
+ "grad_norm": 0.0586460717022419,
+ "learning_rate": 3.842800211248333e-06,
+ "loss": 0.0001,
+ "step": 231
+ },
+ {
+ "epoch": 2.703125,
+ "grad_norm": 0.11408677697181702,
+ "learning_rate": 3.8266027084571335e-06,
+ "loss": 0.001,
+ "step": 232
+ },
+ {
+ "epoch": 2.71484375,
+ "grad_norm": 0.06875021010637283,
+ "learning_rate": 3.810327317623881e-06,
+ "loss": 0.0001,
+ "step": 233
+ },
+ {
+ "epoch": 2.7265625,
+ "grad_norm": 0.037388525903224945,
+ "learning_rate": 3.793974994315991e-06,
+ "loss": 0.0002,
+ "step": 234
+ },
+ {
+ "epoch": 2.73828125,
+ "grad_norm": 0.041430581361055374,
+ "learning_rate": 3.7775466986177763e-06,
+ "loss": 0.0015,
+ "step": 235
+ },
+ {
+ "epoch": 2.75,
+ "grad_norm": 0.26019373536109924,
+ "learning_rate": 3.7610433950740667e-06,
+ "loss": 0.0022,
+ "step": 236
+ },
+ {
+ "epoch": 2.76171875,
+ "grad_norm": 0.16638831794261932,
+ "learning_rate": 3.7444660526335853e-06,
+ "loss": 0.0001,
+ "step": 237
+ },
+ {
+ "epoch": 2.7734375,
+ "grad_norm": 0.11822371184825897,
+ "learning_rate": 3.7278156445920584e-06,
+ "loss": 0.0004,
+ "step": 238
+ },
+ {
+ "epoch": 2.78515625,
+ "grad_norm": 0.055076126009225845,
+ "learning_rate": 3.711093148535068e-06,
+ "loss": 0.0001,
+ "step": 239
+ },
+ {
+ "epoch": 2.796875,
+ "grad_norm": 0.08209875971078873,
+ "learning_rate": 3.6942995462806574e-06,
+ "loss": 0.0012,
+ "step": 240
+ },
+ {
+ "epoch": 2.80859375,
+ "grad_norm": 0.10523220896720886,
+ "learning_rate": 3.6774358238216878e-06,
+ "loss": 0.0004,
+ "step": 241
+ },
+ {
+ "epoch": 2.8203125,
+ "grad_norm": 0.09211058169603348,
+ "learning_rate": 3.660502971267945e-06,
+ "loss": 0.0007,
+ "step": 242
+ },
+ {
+ "epoch": 2.83203125,
+ "grad_norm": 0.6209844946861267,
+ "learning_rate": 3.6435019827880093e-06,
+ "loss": 0.0004,
+ "step": 243
+ },
+ {
+ "epoch": 2.84375,
+ "grad_norm": 0.030900023877620697,
+ "learning_rate": 3.626433856550886e-06,
+ "loss": 0.0002,
+ "step": 244
+ },
+ {
+ "epoch": 2.85546875,
+ "grad_norm": 0.041130077093839645,
+ "learning_rate": 3.6092995946673996e-06,
+ "loss": 0.0003,
+ "step": 245
+ },
+ {
+ "epoch": 2.8671875,
+ "grad_norm": 0.052536819130182266,
+ "learning_rate": 3.5921002031313586e-06,
+ "loss": 0.0001,
+ "step": 246
+ },
+ {
+ "epoch": 2.87890625,
+ "grad_norm": 0.027478178963065147,
+ "learning_rate": 3.574836691760489e-06,
+ "loss": 0.0011,
+ "step": 247
+ },
+ {
+ "epoch": 2.890625,
+ "grad_norm": 0.11695867031812668,
+ "learning_rate": 3.557510074137147e-06,
+ "loss": 0.0002,
+ "step": 248
+ },
+ {
+ "epoch": 2.90234375,
+ "grad_norm": 0.08782754838466644,
+ "learning_rate": 3.540121367548811e-06,
+ "loss": 0.001,
+ "step": 249
+ },
+ {
+ "epoch": 2.9140625,
+ "grad_norm": 0.19123269617557526,
+ "learning_rate": 3.5226715929283507e-06,
+ "loss": 0.0001,
+ "step": 250
+ },
+ {
+ "epoch": 2.92578125,
+ "grad_norm": 0.020774945616722107,
+ "learning_rate": 3.505161774794085e-06,
+ "loss": 0.0006,
+ "step": 251
+ },
+ {
+ "epoch": 2.9375,
+ "grad_norm": 0.12062892317771912,
+ "learning_rate": 3.487592941189636e-06,
+ "loss": 0.0001,
+ "step": 252
+ },
+ {
+ "epoch": 2.94921875,
+ "grad_norm": 0.013076180592179298,
+ "learning_rate": 3.469966123623563e-06,
+ "loss": 0.0011,
+ "step": 253
+ },
+ {
+ "epoch": 2.9609375,
+ "grad_norm": 0.22065430879592896,
+ "learning_rate": 3.4522823570088073e-06,
+ "loss": 0.0001,
+ "step": 254
+ },
+ {
+ "epoch": 2.97265625,
+ "grad_norm": 0.027459079399704933,
+ "learning_rate": 3.434542679601922e-06,
+ "loss": 0.0003,
+ "step": 255
+ },
+ {
+ "epoch": 2.984375,
+ "grad_norm": 0.07469172775745392,
+ "learning_rate": 3.4167481329421204e-06,
+ "loss": 0.0005,
+ "step": 256
+ },
+ {
+ "epoch": 2.99609375,
+ "grad_norm": 0.544292688369751,
+ "learning_rate": 3.39889976179012e-06,
+ "loss": 0.0001,
+ "step": 257
+ },
+ {
+ "epoch": 3.0,
+ "grad_norm": 0.02610701508820057,
+ "learning_rate": 3.380998614066805e-06,
+ "loss": 0.0,
+ "step": 258
+ },
+ {
+ "epoch": 3.01171875,
+ "grad_norm": 0.016433028504252434,
+ "learning_rate": 3.363045740791698e-06,
+ "loss": 0.0,
+ "step": 259
+ },
+ {
+ "epoch": 3.0234375,
+ "grad_norm": 0.009407744742929935,
+ "learning_rate": 3.345042196021257e-06,
+ "loss": 0.0,
+ "step": 260
+ },
+ {
+ "epoch": 3.03515625,
+ "grad_norm": 0.009587760083377361,
+ "learning_rate": 3.326989036786981e-06,
+ "loss": 0.0,
+ "step": 261
+ },
+ {
+ "epoch": 3.046875,
+ "grad_norm": 0.021458568051457405,
+ "learning_rate": 3.3088873230333562e-06,
+ "loss": 0.0001,
+ "step": 262
+ },
+ {
+ "epoch": 3.05859375,
+ "grad_norm": 1.3090940713882446,
+ "learning_rate": 3.290738117555622e-06,
+ "loss": 0.0007,
+ "step": 263
+ },
+ {
+ "epoch": 3.0703125,
+ "grad_norm": 0.008000005036592484,
+ "learning_rate": 3.272542485937369e-06,
+ "loss": 0.0,
+ "step": 264
+ },
+ {
+ "epoch": 3.08203125,
+ "grad_norm": 0.11048968136310577,
+ "learning_rate": 3.2543014964879814e-06,
+ "loss": 0.0004,
+ "step": 265
+ },
+ {
+ "epoch": 3.09375,
+ "grad_norm": 0.010688518173992634,
+ "learning_rate": 3.2360162201799085e-06,
+ "loss": 0.0,
+ "step": 266
+ },
+ {
+ "epoch": 3.10546875,
+ "grad_norm": 0.0585443377494812,
+ "learning_rate": 3.21768773058579e-06,
+ "loss": 0.0001,
+ "step": 267
+ },
+ {
+ "epoch": 3.1171875,
+ "grad_norm": 0.12098421901464462,
+ "learning_rate": 3.1993171038154203e-06,
+ "loss": 0.0002,
+ "step": 268
+ },
+ {
+ "epoch": 3.12890625,
+ "grad_norm": 0.01194986142218113,
+ "learning_rate": 3.180905418452569e-06,
+ "loss": 0.0,
+ "step": 269
+ },
+ {
+ "epoch": 3.140625,
+ "grad_norm": 0.0898946076631546,
+ "learning_rate": 3.162453755491655e-06,
+ "loss": 0.0011,
+ "step": 270
+ },
+ {
+ "epoch": 3.15234375,
+ "grad_norm": 0.04248907417058945,
+ "learning_rate": 3.143963198274278e-06,
+ "loss": 0.0001,
+ "step": 271
+ },
+ {
+ "epoch": 3.1640625,
+ "grad_norm": 0.11775418370962143,
+ "learning_rate": 3.125434832425613e-06,
+ "loss": 0.0002,
+ "step": 272
+ },
+ {
+ "epoch": 3.17578125,
+ "grad_norm": 0.009955376386642456,
+ "learning_rate": 3.1068697457906736e-06,
+ "loss": 0.0,
+ "step": 273
+ },
+ {
+ "epoch": 3.1875,
+ "grad_norm": 0.010195266455411911,
+ "learning_rate": 3.0882690283704355e-06,
+ "loss": 0.0,
+ "step": 274
+ },
+ {
+ "epoch": 3.19921875,
+ "grad_norm": 0.0036824019625782967,
+ "learning_rate": 3.0696337722578444e-06,
+ "loss": 0.0,
+ "step": 275
+ },
+ {
+ "epoch": 3.2109375,
+ "grad_norm": 0.004132798407226801,
+ "learning_rate": 3.0509650715736977e-06,
+ "loss": 0.0,
+ "step": 276
+ },
+ {
+ "epoch": 3.22265625,
+ "grad_norm": 0.0651523619890213,
+ "learning_rate": 3.0322640224024024e-06,
+ "loss": 0.0001,
+ "step": 277
+ },
+ {
+ "epoch": 3.234375,
+ "grad_norm": 0.015174048021435738,
+ "learning_rate": 3.0135317227276247e-06,
+ "loss": 0.0,
+ "step": 278
+ },
+ {
+ "epoch": 3.24609375,
+ "grad_norm": 0.004420771263539791,
+ "learning_rate": 2.994769272367822e-06,
+ "loss": 0.0,
+ "step": 279
+ },
+ {
+ "epoch": 3.2578125,
+ "grad_norm": 0.019537663087248802,
+ "learning_rate": 2.975977772911671e-06,
+ "loss": 0.0001,
+ "step": 280
+ },
+ {
+ "epoch": 3.26953125,
+ "grad_norm": 0.005312444642186165,
+ "learning_rate": 2.9571583276533923e-06,
+ "loss": 0.0,
+ "step": 281
+ },
+ {
+ "epoch": 3.28125,
+ "grad_norm": 0.005001228302717209,
+ "learning_rate": 2.93831204152797e-06,
+ "loss": 0.0,
+ "step": 282
+ },
+ {
+ "epoch": 3.29296875,
+ "grad_norm": 0.02515912428498268,
+ "learning_rate": 2.9194400210462808e-06,
+ "loss": 0.0,
+ "step": 283
+ },
+ {
+ "epoch": 3.3046875,
+ "grad_norm": 0.0026461018715053797,
+ "learning_rate": 2.9005433742301274e-06,
+ "loss": 0.0,
+ "step": 284
+ },
+ {
+ "epoch": 3.31640625,
+ "grad_norm": 0.008561859838664532,
+ "learning_rate": 2.8816232105471864e-06,
+ "loss": 0.0,
+ "step": 285
+ },
+ {
+ "epoch": 3.328125,
+ "grad_norm": 0.0016494860174134374,
+ "learning_rate": 2.8626806408458626e-06,
+ "loss": 0.0,
+ "step": 286
+ },
+ {
+ "epoch": 3.33984375,
+ "grad_norm": 0.13021136820316315,
+ "learning_rate": 2.843716777290074e-06,
+ "loss": 0.0007,
+ "step": 287
+ },
+ {
+ "epoch": 3.3515625,
+ "grad_norm": 0.0030203904025256634,
+ "learning_rate": 2.8247327332939512e-06,
+ "loss": 0.0,
+ "step": 288
+ },
+ {
+ "epoch": 3.36328125,
+ "grad_norm": 0.03953886777162552,
+ "learning_rate": 2.805729623456469e-06,
+ "loss": 0.0,
+ "step": 289
+ },
+ {
+ "epoch": 3.375,
+ "grad_norm": 0.016400372609496117,
+ "learning_rate": 2.786708563496002e-06,
+ "loss": 0.0,
+ "step": 290
+ },
+ {
+ "epoch": 3.38671875,
+ "grad_norm": 0.0036580052692443132,
+ "learning_rate": 2.7676706701848187e-06,
+ "loss": 0.0,
+ "step": 291
+ },
+ {
+ "epoch": 3.3984375,
+ "grad_norm": 0.013516291044652462,
+ "learning_rate": 2.748617061283518e-06,
+ "loss": 0.0,
+ "step": 292
+ },
+ {
+ "epoch": 3.41015625,
+ "grad_norm": 0.0161955077201128,
+ "learning_rate": 2.7295488554753957e-06,
+ "loss": 0.0,
+ "step": 293
+ },
+ {
+ "epoch": 3.421875,
+ "grad_norm": 0.030412085354328156,
+ "learning_rate": 2.710467172300768e-06,
+ "loss": 0.0,
+ "step": 294
+ },
+ {
+ "epoch": 3.43359375,
+ "grad_norm": 0.009741670452058315,
+ "learning_rate": 2.69137313209124e-06,
+ "loss": 0.0,
+ "step": 295
+ },
+ {
+ "epoch": 3.4453125,
+ "grad_norm": 0.0022640388924628496,
+ "learning_rate": 2.672267855903927e-06,
+ "loss": 0.0,
+ "step": 296
+ },
+ {
+ "epoch": 3.45703125,
+ "grad_norm": 0.004546131007373333,
+ "learning_rate": 2.653152465455639e-06,
+ "loss": 0.0,
+ "step": 297
+ },
+ {
+ "epoch": 3.46875,
+ "grad_norm": 0.00977818388491869,
+ "learning_rate": 2.6340280830570142e-06,
+ "loss": 0.0,
+ "step": 298
+ },
+ {
+ "epoch": 3.48046875,
+ "grad_norm": 0.00292399013414979,
+ "learning_rate": 2.614895831546633e-06,
+ "loss": 0.0,
+ "step": 299
+ },
+ {
+ "epoch": 3.4921875,
+ "grad_norm": 0.02362428605556488,
+ "learning_rate": 2.595756834225089e-06,
+ "loss": 0.0001,
+ "step": 300
+ },
+ {
+ "epoch": 3.50390625,
+ "grad_norm": 0.05170333385467529,
+ "learning_rate": 2.576612214789039e-06,
+ "loss": 0.0001,
+ "step": 301
+ },
+ {
+ "epoch": 3.515625,
+ "grad_norm": 0.002428271807730198,
+ "learning_rate": 2.5574630972652263e-06,
+ "loss": 0.0,
+ "step": 302
+ },
+ {
+ "epoch": 3.52734375,
+ "grad_norm": 0.0020236221607774496,
+ "learning_rate": 2.538310605944491e-06,
+ "loss": 0.0,
+ "step": 303
+ },
+ {
+ "epoch": 3.5390625,
+ "grad_norm": 0.0026413940358906984,
+ "learning_rate": 2.5191558653157542e-06,
+ "loss": 0.0,
+ "step": 304
+ },
+ {
+ "epoch": 3.55078125,
+ "grad_norm": 0.001937767956405878,
+ "learning_rate": 2.5e-06,
+ "loss": 0.0,
+ "step": 305
+ },
+ {
+ "epoch": 3.5625,
+ "grad_norm": 0.013072842732071877,
+ "learning_rate": 2.480844134684246e-06,
+ "loss": 0.0,
+ "step": 306
+ },
+ {
+ "epoch": 3.57421875,
+ "grad_norm": 0.07046481966972351,
+ "learning_rate": 2.4616893940555094e-06,
+ "loss": 0.0003,
+ "step": 307
+ },
+ {
+ "epoch": 3.5859375,
+ "grad_norm": 0.002507950412109494,
+ "learning_rate": 2.4425369027347746e-06,
+ "loss": 0.0,
+ "step": 308
+ },
+ {
+ "epoch": 3.59765625,
+ "grad_norm": 0.0024932159576565027,
+ "learning_rate": 2.423387785210962e-06,
+ "loss": 0.0,
+ "step": 309
+ },
+ {
+ "epoch": 3.609375,
+ "grad_norm": 0.007839293219149113,
+ "learning_rate": 2.404243165774912e-06,
+ "loss": 0.0,
+ "step": 310
+ },
+ {
+ "epoch": 3.62109375,
+ "grad_norm": 0.008749544620513916,
+ "learning_rate": 2.3851041684533677e-06,
+ "loss": 0.0,
+ "step": 311
+ },
+ {
+ "epoch": 3.6328125,
+ "grad_norm": 0.00224123802036047,
+ "learning_rate": 2.3659719169429866e-06,
+ "loss": 0.0,
+ "step": 312
+ },
+ {
+ "epoch": 3.64453125,
+ "grad_norm": 0.0036495248787105083,
+ "learning_rate": 2.346847534544362e-06,
+ "loss": 0.0,
+ "step": 313
+ },
+ {
+ "epoch": 3.65625,
+ "grad_norm": 0.008617470040917397,
+ "learning_rate": 2.3277321440960733e-06,
+ "loss": 0.0,
+ "step": 314
+ },
+ {
+ "epoch": 3.66796875,
+ "grad_norm": 0.20711803436279297,
+ "learning_rate": 2.308626867908761e-06,
+ "loss": 0.0004,
+ "step": 315
+ },
+ {
+ "epoch": 3.6796875,
+ "grad_norm": 0.002029536757618189,
+ "learning_rate": 2.2895328276992325e-06,
+ "loss": 0.0,
+ "step": 316
+ },
+ {
+ "epoch": 3.69140625,
+ "grad_norm": 0.0029692472890019417,
+ "learning_rate": 2.270451144524605e-06,
+ "loss": 0.0,
+ "step": 317
+ },
+ {
+ "epoch": 3.703125,
+ "grad_norm": 0.003482841420918703,
+ "learning_rate": 2.251382938716482e-06,
+ "loss": 0.0,
+ "step": 318
+ },
+ {
+ "epoch": 3.71484375,
+ "grad_norm": 0.004736272618174553,
+ "learning_rate": 2.2323293298151817e-06,
+ "loss": 0.0,
+ "step": 319
+ },
+ {
+ "epoch": 3.7265625,
+ "grad_norm": 0.002524860203266144,
+ "learning_rate": 2.2132914365039993e-06,
+ "loss": 0.0,
+ "step": 320
+ },
+ {
+ "epoch": 3.73828125,
+ "grad_norm": 0.0024032641667872667,
+ "learning_rate": 2.1942703765435317e-06,
+ "loss": 0.0,
+ "step": 321
+ },
+ {
+ "epoch": 3.75,
+ "grad_norm": 0.06402894109487534,
+ "learning_rate": 2.1752672667060488e-06,
+ "loss": 0.0002,
+ "step": 322
+ },
+ {
+ "epoch": 3.76171875,
+ "grad_norm": 0.0013841127511113882,
+ "learning_rate": 2.1562832227099266e-06,
+ "loss": 0.0,
+ "step": 323
+ },
+ {
+ "epoch": 3.7734375,
+ "grad_norm": 0.002198501257225871,
+ "learning_rate": 2.137319359154138e-06,
+ "loss": 0.0,
+ "step": 324
+ },
+ {
+ "epoch": 3.78515625,
+ "grad_norm": 0.004288461524993181,
+ "learning_rate": 2.1183767894528135e-06,
+ "loss": 0.0,
+ "step": 325
+ },
+ {
+ "epoch": 3.796875,
+ "grad_norm": 0.16602352261543274,
+ "learning_rate": 2.099456625769872e-06,
+ "loss": 0.0003,
+ "step": 326
+ },
+ {
+ "epoch": 3.80859375,
+ "grad_norm": 0.001620235969312489,
+ "learning_rate": 2.08055997895372e-06,
+ "loss": 0.0,
+ "step": 327
+ },
+ {
+ "epoch": 3.8203125,
+ "grad_norm": 0.004387021530419588,
+ "learning_rate": 2.0616879584720305e-06,
+ "loss": 0.0,
+ "step": 328
+ },
+ {
+ "epoch": 3.83203125,
+ "grad_norm": 0.040472231805324554,
+ "learning_rate": 2.042841672346608e-06,
+ "loss": 0.0001,
+ "step": 329
+ },
+ {
+ "epoch": 3.84375,
+ "grad_norm": 0.03627858683466911,
+ "learning_rate": 2.024022227088329e-06,
+ "loss": 0.0001,
+ "step": 330
+ },
+ {
+ "epoch": 3.85546875,
+ "grad_norm": 0.0029672810342162848,
+ "learning_rate": 2.0052307276321793e-06,
+ "loss": 0.0,
+ "step": 331
+ },
+ {
+ "epoch": 3.8671875,
+ "grad_norm": 0.0023526407312601805,
+ "learning_rate": 1.9864682772723757e-06,
+ "loss": 0.0,
+ "step": 332
+ },
+ {
+ "epoch": 3.87890625,
+ "grad_norm": 0.001383278169669211,
+ "learning_rate": 1.967735977597598e-06,
+ "loss": 0.0,
+ "step": 333
+ },
+ {
+ "epoch": 3.890625,
+ "grad_norm": 0.002337483922019601,
+ "learning_rate": 1.9490349284263036e-06,
+ "loss": 0.0,
+ "step": 334
+ },
+ {
+ "epoch": 3.90234375,
+ "grad_norm": 0.02629532851278782,
+ "learning_rate": 1.930366227742157e-06,
+ "loss": 0.0,
+ "step": 335
+ },
+ {
+ "epoch": 3.9140625,
+ "grad_norm": 0.03508671000599861,
+ "learning_rate": 1.9117309716295658e-06,
+ "loss": 0.0001,
+ "step": 336
+ },
+ {
+ "epoch": 3.92578125,
+ "grad_norm": 0.0021862757857888937,
+ "learning_rate": 1.8931302542093274e-06,
+ "loss": 0.0,
+ "step": 337
+ },
+ {
+ "epoch": 3.9375,
+ "grad_norm": 0.002468815306201577,
+ "learning_rate": 1.8745651675743876e-06,
+ "loss": 0.0,
+ "step": 338
+ },
+ {
+ "epoch": 3.94921875,
+ "grad_norm": 0.028530335053801537,
+ "learning_rate": 1.8560368017257229e-06,
+ "loss": 0.0001,
+ "step": 339
+ },
+ {
+ "epoch": 3.9609375,
+ "grad_norm": 0.004602192435413599,
+ "learning_rate": 1.8375462445083464e-06,
+ "loss": 0.0,
+ "step": 340
+ }
+ ],
+ "logging_steps": 1,
+ "max_steps": 510,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 6,
+ "save_steps": 85,
+ "stateful_callbacks": {
+ "TrainerControl": {
+ "args": {
+ "should_epoch_stop": false,
+ "should_evaluate": false,
+ "should_log": false,
+ "should_save": true,
+ "should_training_stop": false
+ },
+ "attributes": {}
+ }
+ },
+ "total_flos": 8.538004035049882e+17,
+ "train_batch_size": 4,
+ "trial_name": null,
+ "trial_params": null
+}
diff --git a/checkpoint-340/training_args.bin b/checkpoint-340/training_args.bin
new file mode 100644
index 0000000000000000000000000000000000000000..31435c2b54979c306fa2a089f64bc8d21e1d21cf
--- /dev/null
+++ b/checkpoint-340/training_args.bin
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ae0e02a237d0ed5071f0d2c656d0cc6fa0293647ec7cffc6f8d299311f592cdc
+size 8056
diff --git a/checkpoint-340/zero_to_fp32.py b/checkpoint-340/zero_to_fp32.py
new file mode 100644
index 0000000000000000000000000000000000000000..24cc342e78d1a006c782b3a4cd68d9ce786d8fd8
--- /dev/null
+++ b/checkpoint-340/zero_to_fp32.py
@@ -0,0 +1,604 @@
+#!/usr/bin/env python
+
+# Copyright (c) Microsoft Corporation.
+# SPDX-License-Identifier: Apache-2.0
+
+# DeepSpeed Team
+
+# This script extracts fp32 consolidated weights from ZeRO 1, 2 and 3 DeepSpeed checkpoints. It gets
+# copied into the top level checkpoint dir, so the user can easily do the conversion at any point in
+# the future. Once extracted, the weights don't require DeepSpeed and can be used in any
+# application.
+#
+# example: python zero_to_fp32.py . pytorch_model.bin
+
+import argparse
+import torch
+import glob
+import math
+import os
+import re
+from collections import OrderedDict
+from dataclasses import dataclass
+
+# While this script doesn't use DeepSpeed to recover data, the checkpoints are pickled with
+# DeepSpeed data structures, so DeepSpeed has to be available in the current Python environment.
+from deepspeed.utils import logger
+from deepspeed.checkpoint.constants import (DS_VERSION, OPTIMIZER_STATE_DICT, SINGLE_PARTITION_OF_FP32_GROUPS,
+ FP32_FLAT_GROUPS, ZERO_STAGE, PARTITION_COUNT, PARAM_SHAPES, BUFFER_NAMES,
+ FROZEN_PARAM_SHAPES, FROZEN_PARAM_FRAGMENTS)
+
+
+@dataclass
+class zero_model_state:
+ buffers: dict
+ param_shapes: dict
+ shared_params: list
+ ds_version: int
+ frozen_param_shapes: dict
+ frozen_param_fragments: dict
+
+
+debug = 0
+
+# load to cpu
+device = torch.device('cpu')
+
+
+def atoi(text):
+ return int(text) if text.isdigit() else text
+
+
+def natural_keys(text):
+ '''
+ alist.sort(key=natural_keys) sorts in human order
+ http://nedbatchelder.com/blog/200712/human_sorting.html
+ (See Toothy's implementation in the comments)
+ '''
+ return [atoi(c) for c in re.split(r'(\d+)', text)]
+
+
+def get_model_state_file(checkpoint_dir, zero_stage):
+ if not os.path.isdir(checkpoint_dir):
+ raise FileNotFoundError(f"Directory '{checkpoint_dir}' doesn't exist")
+
+ # there should be only one file
+ if zero_stage <= 2:
+ file = os.path.join(checkpoint_dir, "mp_rank_00_model_states.pt")
+ elif zero_stage == 3:
+ file = os.path.join(checkpoint_dir, "zero_pp_rank_0_mp_rank_00_model_states.pt")
+
+ if not os.path.exists(file):
+ raise FileNotFoundError(f"can't find model states file at '{file}'")
+
+ return file
+
+
+def get_checkpoint_files(checkpoint_dir, glob_pattern):
+ # XXX: need to test that this simple glob rule works for multi-node setup too
+ ckpt_files = sorted(glob.glob(os.path.join(checkpoint_dir, glob_pattern)), key=natural_keys)
+
+ if len(ckpt_files) == 0:
+ raise FileNotFoundError(f"can't find {glob_pattern} files in directory '{checkpoint_dir}'")
+
+ return ckpt_files
+
+
+def get_optim_files(checkpoint_dir):
+ return get_checkpoint_files(checkpoint_dir, "*_optim_states.pt")
+
+
+def get_model_state_files(checkpoint_dir):
+ return get_checkpoint_files(checkpoint_dir, "*_model_states.pt")
+
+
+def parse_model_states(files):
+ zero_model_states = []
+ for file in files:
+ state_dict = torch.load(file, map_location=device)
+
+ if BUFFER_NAMES not in state_dict:
+ raise ValueError(f"{file} is not a model state checkpoint")
+ buffer_names = state_dict[BUFFER_NAMES]
+ if debug:
+ print("Found buffers:", buffer_names)
+
+ # recover just the buffers while restoring them to fp32 if they were saved in fp16
+ buffers = {k: v.float() for k, v in state_dict["module"].items() if k in buffer_names}
+ param_shapes = state_dict[PARAM_SHAPES]
+
+ # collect parameters that are included in param_shapes
+ param_names = []
+ for s in param_shapes:
+ for name in s.keys():
+ param_names.append(name)
+
+ # update with frozen parameters
+ frozen_param_shapes = state_dict.get(FROZEN_PARAM_SHAPES, None)
+ if frozen_param_shapes is not None:
+ if debug:
+ print(f"Found frozen_param_shapes: {frozen_param_shapes}")
+ param_names += list(frozen_param_shapes.keys())
+
+ # handle shared params
+ shared_params = [[k, v] for k, v in state_dict["shared_params"].items()]
+
+ ds_version = state_dict.get(DS_VERSION, None)
+
+ frozen_param_fragments = state_dict.get(FROZEN_PARAM_FRAGMENTS, None)
+
+ z_model_state = zero_model_state(buffers=buffers,
+ param_shapes=param_shapes,
+ shared_params=shared_params,
+ ds_version=ds_version,
+ frozen_param_shapes=frozen_param_shapes,
+ frozen_param_fragments=frozen_param_fragments)
+ zero_model_states.append(z_model_state)
+
+ return zero_model_states
+
+
+def parse_optim_states(files, ds_checkpoint_dir):
+
+ total_files = len(files)
+ state_dicts = []
+ for f in files:
+ state_dict = torch.load(f, map_location=device)
+ # immediately discard the two potentially huge optimizer states, as we only care about the fp32 master weights,
+ # and also handle the case where they were already removed by another helper script
+ state_dict["optimizer_state_dict"].pop("optimizer_state_dict", None)
+ state_dicts.append(state_dict)
+
+ if ZERO_STAGE not in state_dicts[0][OPTIMIZER_STATE_DICT]:
+ raise ValueError(f"{files[0]} is not a zero checkpoint")
+ zero_stage = state_dicts[0][OPTIMIZER_STATE_DICT][ZERO_STAGE]
+ world_size = state_dicts[0][OPTIMIZER_STATE_DICT][PARTITION_COUNT]
+
+ # For ZeRO-2 each param group can have different partition_count as data parallelism for expert
+ # parameters can be different from data parallelism for non-expert parameters. So we can just
+ # use the max of the partition_count to get the dp world_size.
+
+ if isinstance(world_size, list):
+ world_size = max(world_size)
+
+ if world_size != total_files:
+ raise ValueError(
+ f"Expected {world_size} '*_optim_states.pt' files under '{ds_checkpoint_dir}' but found {total_files}. "
+ "Possibly due to an overwrite of an old checkpoint, or a checkpoint didn't get saved by one or more processes."
+ )
+
+ # the groups are named differently in each stage
+ if zero_stage <= 2:
+ fp32_groups_key = SINGLE_PARTITION_OF_FP32_GROUPS
+ elif zero_stage == 3:
+ fp32_groups_key = FP32_FLAT_GROUPS
+ else:
+ raise ValueError(f"unknown zero stage {zero_stage}")
+
+ if zero_stage <= 2:
+ fp32_flat_groups = [state_dicts[i][OPTIMIZER_STATE_DICT][fp32_groups_key] for i in range(len(state_dicts))]
+ elif zero_stage == 3:
+ # if there is more than one param group, there will be multiple flattened tensors - one
+ # flattened tensor per group - for simplicity merge them into a single tensor
+ #
+ # XXX: could make the script more memory efficient for when there are multiple groups - it
+ # will require matching the sub-lists of param_shapes for each param group flattened tensor
+
+ fp32_flat_groups = [
+ torch.cat(state_dicts[i][OPTIMIZER_STATE_DICT][fp32_groups_key], 0) for i in range(len(state_dicts))
+ ]
+
+ return zero_stage, world_size, fp32_flat_groups
+
+
+def _get_fp32_state_dict_from_zero_checkpoint(ds_checkpoint_dir, exclude_frozen_parameters):
+ """
+ Returns fp32 state_dict reconstructed from ds checkpoint
+
+ Args:
+ - ``ds_checkpoint_dir``: path to the deepspeed checkpoint folder (where the optimizer files are)
+
+ """
+ print(f"Processing zero checkpoint '{ds_checkpoint_dir}'")
+
+ optim_files = get_optim_files(ds_checkpoint_dir)
+ zero_stage, world_size, fp32_flat_groups = parse_optim_states(optim_files, ds_checkpoint_dir)
+ print(f"Detected checkpoint of type zero stage {zero_stage}, world_size: {world_size}")
+
+ model_files = get_model_state_files(ds_checkpoint_dir)
+
+ zero_model_states = parse_model_states(model_files)
+ print(f'Parsing checkpoint created by deepspeed=={zero_model_states[0].ds_version}')
+
+ if zero_stage <= 2:
+ return _get_fp32_state_dict_from_zero2_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters)
+ elif zero_stage == 3:
+ return _get_fp32_state_dict_from_zero3_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters)
+
+
+def _zero2_merge_frozen_params(state_dict, zero_model_states):
+ if zero_model_states[0].frozen_param_shapes is None or len(zero_model_states[0].frozen_param_shapes) == 0:
+ return
+
+ frozen_param_shapes = zero_model_states[0].frozen_param_shapes
+ frozen_param_fragments = zero_model_states[0].frozen_param_fragments
+
+ if debug:
+ num_elem = sum(s.numel() for s in frozen_param_shapes.values())
+ print(f'rank 0: {FROZEN_PARAM_SHAPES}.numel = {num_elem}')
+
+ wanted_params = len(frozen_param_shapes)
+ wanted_numel = sum(s.numel() for s in frozen_param_shapes.values())
+ avail_numel = sum([p.numel() for p in frozen_param_fragments.values()])
+ print(f'Frozen params: Have {avail_numel} numels to process.')
+ print(f'Frozen params: Need {wanted_numel} numels in {wanted_params} params')
+
+ total_params = 0
+ total_numel = 0
+ for name, shape in frozen_param_shapes.items():
+ total_params += 1
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+
+ state_dict[name] = frozen_param_fragments[name]
+
+ if debug:
+ print(f"{name} full shape: {shape} unpartitioned numel {unpartitioned_numel} ")
+
+ print(f"Reconstructed Frozen fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _has_callable(obj, fn):
+ attr = getattr(obj, fn, None)
+ return callable(attr)
+
+
+def _zero2_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states):
+ param_shapes = zero_model_states[0].param_shapes
+
+ # Reconstruction protocol:
+ #
+ # XXX: document this
+
+ if debug:
+ for i in range(world_size):
+ for j in range(len(fp32_flat_groups[0])):
+ print(f"{FP32_FLAT_GROUPS}[{i}][{j}].shape={fp32_flat_groups[i][j].shape}")
+
+ # XXX: memory usage doubles here (zero2)
+ num_param_groups = len(fp32_flat_groups[0])
+ merged_single_partition_of_fp32_groups = []
+ for i in range(num_param_groups):
+ merged_partitions = [sd[i] for sd in fp32_flat_groups]
+ full_single_fp32_vector = torch.cat(merged_partitions, 0)
+ merged_single_partition_of_fp32_groups.append(full_single_fp32_vector)
+ avail_numel = sum(
+ [full_single_fp32_vector.numel() for full_single_fp32_vector in merged_single_partition_of_fp32_groups])
+
+ if debug:
+ wanted_params = sum([len(shapes) for shapes in param_shapes])
+ wanted_numel = sum([sum(shape.numel() for shape in shapes.values()) for shapes in param_shapes])
+ # not asserting if there is a mismatch due to possible padding
+ print(f"Have {avail_numel} numels to process.")
+ print(f"Need {wanted_numel} numels in {wanted_params} params.")
+
+ # params
+ # XXX: for huge models that can't fit into the host's RAM we will have to recode this to support
+ # out-of-core computing solution
+ total_numel = 0
+ total_params = 0
+ for shapes, full_single_fp32_vector in zip(param_shapes, merged_single_partition_of_fp32_groups):
+ offset = 0
+ avail_numel = full_single_fp32_vector.numel()
+ for name, shape in shapes.items():
+
+ unpartitioned_numel = shape.numel() if _has_callable(shape, 'numel') else math.prod(shape)
+ total_numel += unpartitioned_numel
+ total_params += 1
+
+ if debug:
+ print(f"{name} full shape: {shape} unpartitioned numel {unpartitioned_numel} ")
+ state_dict[name] = full_single_fp32_vector.narrow(0, offset, unpartitioned_numel).view(shape)
+ offset += unpartitioned_numel
+
+ # Z2 started to align to 2*world_size to improve nccl performance. Therefore both offset and
+ # avail_numel can differ by anywhere between 0..2*world_size. Due to two unrelated complex
+ # paddings performed in the code it's almost impossible to predict the exact numbers w/o the
+ # live optimizer object, so we are checking that the numbers are within the right range
+ align_to = 2 * world_size
+
+ def zero2_align(x):
+ return align_to * math.ceil(x / align_to)
+
+ if debug:
+ print(f"original offset={offset}, avail_numel={avail_numel}")
+
+ offset = zero2_align(offset)
+ avail_numel = zero2_align(avail_numel)
+
+ if debug:
+ print(f"aligned offset={offset}, avail_numel={avail_numel}")
+
+ # Sanity check
+ if offset != avail_numel:
+ raise ValueError(f"consumed {offset} numels out of {avail_numel} - something is wrong")
+
+ print(f"Reconstructed fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _get_fp32_state_dict_from_zero2_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters):
+ state_dict = OrderedDict()
+
+ # buffers
+ buffers = zero_model_states[0].buffers
+ state_dict.update(buffers)
+ if debug:
+ print(f"added {len(buffers)} buffers")
+
+ if not exclude_frozen_parameters:
+ _zero2_merge_frozen_params(state_dict, zero_model_states)
+
+ _zero2_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states)
+
+ # recover shared parameters
+ for pair in zero_model_states[0].shared_params:
+ if pair[1] in state_dict:
+ state_dict[pair[0]] = state_dict[pair[1]]
+
+ return state_dict
+
+
+def zero3_partitioned_param_info(unpartitioned_numel, world_size):
+ remainder = unpartitioned_numel % world_size
+ padding_numel = (world_size - remainder) if remainder else 0
+ partitioned_numel = math.ceil(unpartitioned_numel / world_size)
+ return partitioned_numel, padding_numel
+
+
+def _zero3_merge_frozen_params(state_dict, world_size, zero_model_states):
+ if zero_model_states[0].frozen_param_shapes is None or len(zero_model_states[0].frozen_param_shapes) == 0:
+ return
+
+ if debug:
+ for i in range(world_size):
+ num_elem = sum(s.numel() for s in zero_model_states[i].frozen_param_fragments.values())
+ print(f'rank {i}: {FROZEN_PARAM_SHAPES}.numel = {num_elem}')
+
+ frozen_param_shapes = zero_model_states[0].frozen_param_shapes
+ wanted_params = len(frozen_param_shapes)
+ wanted_numel = sum(s.numel() for s in frozen_param_shapes.values())
+ avail_numel = sum([p.numel() for p in zero_model_states[0].frozen_param_fragments.values()]) * world_size
+ print(f'Frozen params: Have {avail_numel} numels to process.')
+ print(f'Frozen params: Need {wanted_numel} numels in {wanted_params} params')
+
+ total_params = 0
+ total_numel = 0
+ for name, shape in zero_model_states[0].frozen_param_shapes.items():
+ total_params += 1
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+
+ param_frags = tuple(model_state.frozen_param_fragments[name] for model_state in zero_model_states)
+ state_dict[name] = torch.cat(param_frags, 0).narrow(0, 0, unpartitioned_numel).view(shape)
+
+ partitioned_numel, partitioned_padding_numel = zero3_partitioned_param_info(unpartitioned_numel, world_size)
+
+ if debug:
+ print(
+ f"Frozen params: {total_params} {name} full shape: {shape} partition0 numel={partitioned_numel} partitioned_padding_numel={partitioned_padding_numel}"
+ )
+
+ print(f"Reconstructed Frozen fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _zero3_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states):
+ param_shapes = zero_model_states[0].param_shapes
+ avail_numel = fp32_flat_groups[0].numel() * world_size
+ # Reconstruction protocol: For zero3 we need to zip the partitions together at boundary of each
+ # param, re-consolidating each param, while dealing with padding if any
+
+ # merge list of dicts, preserving order
+ param_shapes = {k: v for d in param_shapes for k, v in d.items()}
+
+ if debug:
+ for i in range(world_size):
+ print(f"{FP32_FLAT_GROUPS}[{i}].shape={fp32_flat_groups[i].shape}")
+
+ wanted_params = len(param_shapes)
+ wanted_numel = sum(shape.numel() for shape in param_shapes.values())
+ # not asserting if there is a mismatch due to possible padding
+ avail_numel = fp32_flat_groups[0].numel() * world_size
+ print(f"Trainable params: Have {avail_numel} numels to process.")
+ print(f"Trainable params: Need {wanted_numel} numels in {wanted_params} params.")
+
+ # params
+ # XXX: for huge models that can't fit into the host's RAM we will have to recode this to support
+ # out-of-core computing solution
+ offset = 0
+ total_numel = 0
+ total_params = 0
+ for name, shape in param_shapes.items():
+
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+ total_params += 1
+
+ partitioned_numel, partitioned_padding_numel = zero3_partitioned_param_info(unpartitioned_numel, world_size)
+
+ if debug:
+ print(
+ f"Trainable params: {total_params} {name} full shape: {shape} partition0 numel={partitioned_numel} partitioned_padding_numel={partitioned_padding_numel}"
+ )
+
+ # XXX: memory usage doubles here
+ state_dict[name] = torch.cat(
+ tuple(fp32_flat_groups[i].narrow(0, offset, partitioned_numel) for i in range(world_size)),
+ 0).narrow(0, 0, unpartitioned_numel).view(shape)
+ offset += partitioned_numel
+
+ offset *= world_size
+
+ # Sanity check
+ if offset != avail_numel:
+ raise ValueError(f"consumed {offset} numels out of {avail_numel} - something is wrong")
+
+ print(f"Reconstructed Trainable fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _get_fp32_state_dict_from_zero3_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters):
+ state_dict = OrderedDict()
+
+ # buffers
+ buffers = zero_model_states[0].buffers
+ state_dict.update(buffers)
+ if debug:
+ print(f"added {len(buffers)} buffers")
+
+ if not exclude_frozen_parameters:
+ _zero3_merge_frozen_params(state_dict, world_size, zero_model_states)
+
+ _zero3_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states)
+
+ # recover shared parameters
+ for pair in zero_model_states[0].shared_params:
+ if pair[1] in state_dict:
+ state_dict[pair[0]] = state_dict[pair[1]]
+
+ return state_dict
+
+
+def get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag=None, exclude_frozen_parameters=False):
+ """
+ Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated state_dict that can be loaded with
+ ``load_state_dict()`` and used for training without DeepSpeed or shared with others, for example
+ via a model hub.
+
+ Args:
+ - ``checkpoint_dir``: path to the desired checkpoint folder
+ - ``tag``: checkpoint tag used as a unique identifier for the checkpoint. If not provided, will attempt to read the tag from the file named ``latest`` in the checkpoint folder, e.g., ``global_step14``
+ - ``exclude_frozen_parameters``: exclude frozen parameters
+
+ Returns:
+ - pytorch ``state_dict``
+
+ Note: this approach may not work if your application doesn't have sufficient free CPU memory; in
+ that case you may need to use the offline approach via the ``zero_to_fp32.py`` script that is
+ saved with the checkpoint.
+
+ A typical usage might be ::
+
+ from deepspeed.utils.zero_to_fp32 import get_fp32_state_dict_from_zero_checkpoint
+ # do the training and checkpoint saving
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir) # already on cpu
+ model = model.cpu() # move to cpu
+ model.load_state_dict(state_dict)
+ # submit to model hub or save the model to share with others
+
+ In this example the ``model`` will no longer be usable in the DeepSpeed context of the same
+ application, i.e., you will need to re-initialize the DeepSpeed engine, since
+ ``model.load_state_dict(state_dict)`` will remove all the DeepSpeed magic from it.
+
+ If you want it all done for you, use ``load_state_dict_from_zero_checkpoint`` instead.
+
+ """
+ if tag is None:
+ latest_path = os.path.join(checkpoint_dir, 'latest')
+ if os.path.isfile(latest_path):
+ with open(latest_path, 'r') as fd:
+ tag = fd.read().strip()
+ else:
+ raise ValueError(f"Unable to find 'latest' file at {latest_path}")
+
+ ds_checkpoint_dir = os.path.join(checkpoint_dir, tag)
+
+ if not os.path.isdir(ds_checkpoint_dir):
+ raise FileNotFoundError(f"Directory '{ds_checkpoint_dir}' doesn't exist")
+
+ return _get_fp32_state_dict_from_zero_checkpoint(ds_checkpoint_dir, exclude_frozen_parameters)
+
+
+def convert_zero_checkpoint_to_fp32_state_dict(checkpoint_dir, output_file, tag=None, exclude_frozen_parameters=False):
+ """
+ Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated ``state_dict`` file that can be
+ loaded with ``torch.load(file)`` + ``load_state_dict()`` and used for training without DeepSpeed.
+
+ Args:
+ - ``checkpoint_dir``: path to the desired checkpoint folder. (one that contains the tag-folder, like ``global_step14``)
+ - ``output_file``: path to the pytorch fp32 state_dict output file (e.g. path/pytorch_model.bin)
+ - ``tag``: checkpoint tag used as a unique identifier for checkpoint. If not provided will attempt to load tag in the file named ``latest`` in the checkpoint folder, e.g., ``global_step14``
+ - ``exclude_frozen_parameters``: exclude frozen parameters
+ """
+
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag, exclude_frozen_parameters)
+ print(f"Saving fp32 state dict to {output_file}")
+ torch.save(state_dict, output_file)
+
+
+def load_state_dict_from_zero_checkpoint(model, checkpoint_dir, tag=None):
+ """
+ 1. Put the provided model to cpu
+ 2. Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated ``state_dict``
+ 3. Load it into the provided model
+
+ Args:
+ - ``model``: the model object to update
+ - ``checkpoint_dir``: path to the desired checkpoint folder. (one that contains the tag-folder, like ``global_step14``)
+ - ``tag``: checkpoint tag used as a unique identifier for checkpoint. If not provided will attempt to load tag in the file named ``latest`` in the checkpoint folder, e.g., ``global_step14``
+
+ Returns:
+ - ``model``: modified model
+
+ Make sure you have plenty of CPU memory available before you call this function. If you don't
+ have enough use the ``zero_to_fp32.py`` utility to do the conversion. You will find it
+ conveniently placed for you in the checkpoint folder.
+
+ A typical usage might be ::
+
+ from deepspeed.utils.zero_to_fp32 import load_state_dict_from_zero_checkpoint
+ model = load_state_dict_from_zero_checkpoint(trainer.model, checkpoint_dir)
+ # submit to model hub or save the model to share with others
+
+ Note that once this has been run, the ``model`` will no longer be usable in the DeepSpeed context
+ of the same application, i.e., you will need to re-initialize the DeepSpeed engine, since
+ ``model.load_state_dict(state_dict)`` will remove all the DeepSpeed magic from it.
+
+ """
+ logger.info("Extracting fp32 weights")
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag)
+
+ logger.info("Overwriting model with fp32 weights")
+ model = model.cpu()
+ model.load_state_dict(state_dict, strict=False)
+
+ return model
+
+
+if __name__ == "__main__":
+
+ parser = argparse.ArgumentParser()
+ parser.add_argument("checkpoint_dir",
+ type=str,
+ help="path to the desired checkpoint folder, e.g., path/checkpoint-12")
+ parser.add_argument(
+ "output_file",
+ type=str,
+ help="path to the pytorch fp32 state_dict output file (e.g. path/checkpoint-12/pytorch_model.bin)")
+ parser.add_argument("-t",
+ "--tag",
+ type=str,
+ default=None,
+ help="checkpoint tag used as a unique identifier for checkpoint. e.g., global_step1")
+ parser.add_argument("--exclude_frozen_parameters", action='store_true', help="exclude frozen parameters")
+ parser.add_argument("-d", "--debug", action='store_true', help="enable debug")
+ args = parser.parse_args()
+
+ debug = args.debug
+
+ convert_zero_checkpoint_to_fp32_state_dict(args.checkpoint_dir,
+ args.output_file,
+ tag=args.tag,
+ exclude_frozen_parameters=args.exclude_frozen_parameters)
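As a quick sanity check on the ZeRO-3 sharding arithmetic in the script above, the math of `zero3_partitioned_param_info` can be reproduced standalone, independent of DeepSpeed (a minimal sketch; the example numbers are hypothetical, not taken from this checkpoint):

```python
import math

def zero3_partitioned_param_info(unpartitioned_numel, world_size):
    # Each rank holds ceil(numel / world_size) elements; the final shard is
    # padded so that partitioned_numel * world_size == numel + padding_numel.
    remainder = unpartitioned_numel % world_size
    padding_numel = (world_size - remainder) if remainder else 0
    partitioned_numel = math.ceil(unpartitioned_numel / world_size)
    return partitioned_numel, padding_numel

# A 10-element parameter sharded across 4 ranks: 3 elements per rank, 2 of padding.
print(zero3_partitioned_param_info(10, 4))  # (3, 2)
```

This is why `_zero3_merge_trainable_params` narrows each concatenated shard back to `unpartitioned_numel` before the `.view(shape)`: the trailing padding elements are discarded during reconstruction.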
diff --git a/checkpoint-425/README.md b/checkpoint-425/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..be5c87703f12b547886cc6a2ecfbe9ee150496fa
--- /dev/null
+++ b/checkpoint-425/README.md
@@ -0,0 +1,202 @@
+---
+base_model: meta-llama/Llama-3.1-8B-Instruct
+library_name: peft
+---
+
+# Model Card for Model ID
+
+
+
+
+
+## Model Details
+
+### Model Description
+
+
+
+
+
+- **Developed by:** [More Information Needed]
+- **Funded by [optional]:** [More Information Needed]
+- **Shared by [optional]:** [More Information Needed]
+- **Model type:** [More Information Needed]
+- **Language(s) (NLP):** [More Information Needed]
+- **License:** [More Information Needed]
+- **Finetuned from model [optional]:** [More Information Needed]
+
+### Model Sources [optional]
+
+
+
+- **Repository:** [More Information Needed]
+- **Paper [optional]:** [More Information Needed]
+- **Demo [optional]:** [More Information Needed]
+
+## Uses
+
+
+
+### Direct Use
+
+
+
+[More Information Needed]
+
+### Downstream Use [optional]
+
+
+
+[More Information Needed]
+
+### Out-of-Scope Use
+
+
+
+[More Information Needed]
+
+## Bias, Risks, and Limitations
+
+
+
+[More Information Needed]
+
+### Recommendations
+
+
+
+Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
+
+## How to Get Started with the Model
+
+Use the code below to get started with the model.
+
+[More Information Needed]
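+
+As a minimal sketch (assuming the standard `peft` + `transformers` adapter-loading workflow; the local checkpoint path below is illustrative), the QLoRA adapter in this directory can be attached to its base model like so:
+
+```python
+from transformers import AutoModelForCausalLM, AutoTokenizer
+from peft import PeftModel
+
+# Load the base model this adapter was trained on (see adapter_config.json)
+base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")
+
+# Attach the LoRA adapter weights (and the saved embed_tokens / lm_head
+# modules listed in modules_to_save) from this checkpoint directory
+model = PeftModel.from_pretrained(base, "checkpoint-425")
+tokenizer = AutoTokenizer.from_pretrained("checkpoint-425")
+```
+
+Note that access to the gated `meta-llama/Llama-3.1-8B-Instruct` weights is required for the base model.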
+
+## Training Details
+
+### Training Data
+
+
+
+[More Information Needed]
+
+### Training Procedure
+
+
+
+#### Preprocessing [optional]
+
+[More Information Needed]
+
+
+#### Training Hyperparameters
+
+- **Training regime:** [More Information Needed]
+
+#### Speeds, Sizes, Times [optional]
+
+
+
+[More Information Needed]
+
+## Evaluation
+
+
+
+### Testing Data, Factors & Metrics
+
+#### Testing Data
+
+
+
+[More Information Needed]
+
+#### Factors
+
+
+
+[More Information Needed]
+
+#### Metrics
+
+
+
+[More Information Needed]
+
+### Results
+
+[More Information Needed]
+
+#### Summary
+
+
+
+## Model Examination [optional]
+
+
+
+[More Information Needed]
+
+## Environmental Impact
+
+
+
+Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
+
+- **Hardware Type:** [More Information Needed]
+- **Hours used:** [More Information Needed]
+- **Cloud Provider:** [More Information Needed]
+- **Compute Region:** [More Information Needed]
+- **Carbon Emitted:** [More Information Needed]
+
+## Technical Specifications [optional]
+
+### Model Architecture and Objective
+
+[More Information Needed]
+
+### Compute Infrastructure
+
+[More Information Needed]
+
+#### Hardware
+
+[More Information Needed]
+
+#### Software
+
+[More Information Needed]
+
+## Citation [optional]
+
+
+
+**BibTeX:**
+
+[More Information Needed]
+
+**APA:**
+
+[More Information Needed]
+
+## Glossary [optional]
+
+
+
+[More Information Needed]
+
+## More Information [optional]
+
+[More Information Needed]
+
+## Model Card Authors [optional]
+
+[More Information Needed]
+
+## Model Card Contact
+
+[More Information Needed]
+### Framework versions
+
+- PEFT 0.14.0
\ No newline at end of file
diff --git a/checkpoint-425/adapter_config.json b/checkpoint-425/adapter_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..9dfb3ab60881d002c4cdbcc157a93958018fe683
--- /dev/null
+++ b/checkpoint-425/adapter_config.json
@@ -0,0 +1,40 @@
+{
+ "alpha_pattern": {},
+ "auto_mapping": null,
+ "base_model_name_or_path": "meta-llama/Llama-3.1-8B-Instruct",
+ "bias": "none",
+ "eva_config": null,
+ "exclude_modules": null,
+ "fan_in_fan_out": null,
+ "inference_mode": true,
+ "init_lora_weights": true,
+ "layer_replication": null,
+ "layers_pattern": null,
+ "layers_to_transform": null,
+ "loftq_config": {},
+ "lora_alpha": 512,
+ "lora_bias": false,
+ "lora_dropout": 0.05,
+ "megatron_config": null,
+ "megatron_core": "megatron.core",
+ "modules_to_save": [
+ "embed_tokens",
+ "lm_head"
+ ],
+ "peft_type": "LORA",
+ "r": 256,
+ "rank_pattern": {},
+ "revision": null,
+ "target_modules": [
+ "v_proj",
+ "up_proj",
+ "q_proj",
+ "o_proj",
+ "down_proj",
+ "gate_proj",
+ "k_proj"
+ ],
+ "task_type": "CAUSAL_LM",
+ "use_dora": false,
+ "use_rslora": false
+}
\ No newline at end of file
diff --git a/checkpoint-425/adapter_model.safetensors b/checkpoint-425/adapter_model.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..3f3e72ee2bebcc69ae60d4c696f31cf5a408e203
--- /dev/null
+++ b/checkpoint-425/adapter_model.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:bc830f7c14f53a0d40ad5d6ade9fdc7fcfc4199b30dc5260e4f8c50d73adc94b
+size 3443586272
diff --git a/checkpoint-425/global_step422/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt b/checkpoint-425/global_step422/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt
new file mode 100644
index 0000000000000000000000000000000000000000..ecde8714f71ad18a81d08e25f69b8ec7a462962d
--- /dev/null
+++ b/checkpoint-425/global_step422/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:28fd341bc85d843ba6a639283a098531bc46b01608c6cebc412c8820d723cf9c
+size 20661195036
diff --git a/checkpoint-425/global_step422/mp_rank_00_model_states.pt b/checkpoint-425/global_step422/mp_rank_00_model_states.pt
new file mode 100644
index 0000000000000000000000000000000000000000..6e1717f41445f031e5f1b52e94459e3a3864456d
--- /dev/null
+++ b/checkpoint-425/global_step422/mp_rank_00_model_states.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:30d70f57fe0593cb12bfa3792eae9df01ceffa6f4bfa303318e0f144dc923ee7
+size 3555326649
diff --git a/checkpoint-425/latest b/checkpoint-425/latest
new file mode 100644
index 0000000000000000000000000000000000000000..531eb19c1567f8fb524ba7602fa82177271b58d5
--- /dev/null
+++ b/checkpoint-425/latest
@@ -0,0 +1 @@
+global_step422
\ No newline at end of file
diff --git a/checkpoint-425/rng_state.pth b/checkpoint-425/rng_state.pth
new file mode 100644
index 0000000000000000000000000000000000000000..e614be43f9c1db43e730670abf1ec50896fabbe9
--- /dev/null
+++ b/checkpoint-425/rng_state.pth
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1424f4c695715f74affeb7d9cb20993374868e963cc5cfc8c77c7ef7fa234853
+size 14244
diff --git a/checkpoint-425/scheduler.pt b/checkpoint-425/scheduler.pt
new file mode 100644
index 0000000000000000000000000000000000000000..9a74790601b3ba3477ccd42ff9565964cb71065c
--- /dev/null
+++ b/checkpoint-425/scheduler.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9687bebd8a0be11abb8ced29bbecfebd991cc87189239cf1779a88295b6e131d
+size 1064
diff --git a/checkpoint-425/special_tokens_map.json b/checkpoint-425/special_tokens_map.json
new file mode 100644
index 0000000000000000000000000000000000000000..278b7f0f84be865c4687700ee7b3c63d89a51e18
--- /dev/null
+++ b/checkpoint-425/special_tokens_map.json
@@ -0,0 +1,23 @@
+{
+ "bos_token": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|eot_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<|end_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+}
diff --git a/checkpoint-425/tokenizer.json b/checkpoint-425/tokenizer.json
new file mode 100644
index 0000000000000000000000000000000000000000..1c1d8d5c9024994f1d3b00f9662b8dd89ca13cf2
--- /dev/null
+++ b/checkpoint-425/tokenizer.json
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6b9e4e7fb171f92fd137b777cc2714bf87d11576700a1dcd7a399e7bbe39537b
+size 17209920
diff --git a/checkpoint-425/tokenizer_config.json b/checkpoint-425/tokenizer_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..ca91a2ef55f4239a7af81d7c9abb05f53621a07b
--- /dev/null
+++ b/checkpoint-425/tokenizer_config.json
@@ -0,0 +1,2064 @@
+{
+ "added_tokens_decoder": {
+ "128000": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128001": {
+ "content": "<|end_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128002": {
+ "content": "<|reserved_special_token_0|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128003": {
+ "content": "<|reserved_special_token_1|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128004": {
+ "content": "<|finetune_right_pad_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128005": {
+ "content": "<|reserved_special_token_2|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128006": {
+ "content": "<|start_header_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128007": {
+ "content": "<|end_header_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128008": {
+ "content": "<|eom_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128009": {
+ "content": "<|eot_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128010": {
+ "content": "<|python_tag|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128011": {
+ "content": "<|reserved_special_token_3|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128012": {
+ "content": "<|reserved_special_token_4|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128013": {
+ "content": "<|reserved_special_token_5|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128014": {
+ "content": "<|reserved_special_token_6|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128015": {
+ "content": "<|reserved_special_token_7|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128016": {
+ "content": "<|reserved_special_token_8|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128017": {
+ "content": "<|reserved_special_token_9|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128018": {
+ "content": "<|reserved_special_token_10|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128019": {
+ "content": "<|reserved_special_token_11|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128020": {
+ "content": "<|reserved_special_token_12|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128021": {
+ "content": "<|reserved_special_token_13|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128022": {
+ "content": "<|reserved_special_token_14|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128023": {
+ "content": "<|reserved_special_token_15|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128024": {
+ "content": "<|reserved_special_token_16|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128025": {
+ "content": "<|reserved_special_token_17|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128026": {
+ "content": "<|reserved_special_token_18|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128027": {
+ "content": "<|reserved_special_token_19|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128028": {
+ "content": "<|reserved_special_token_20|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128029": {
+ "content": "<|reserved_special_token_21|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128030": {
+ "content": "<|reserved_special_token_22|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128031": {
+ "content": "<|reserved_special_token_23|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128032": {
+ "content": "<|reserved_special_token_24|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128033": {
+ "content": "<|reserved_special_token_25|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128034": {
+ "content": "<|reserved_special_token_26|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128035": {
+ "content": "<|reserved_special_token_27|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128036": {
+ "content": "<|reserved_special_token_28|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128037": {
+ "content": "<|reserved_special_token_29|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128038": {
+ "content": "<|reserved_special_token_30|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128039": {
+ "content": "<|reserved_special_token_31|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128040": {
+ "content": "<|reserved_special_token_32|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128041": {
+ "content": "<|reserved_special_token_33|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128042": {
+ "content": "<|reserved_special_token_34|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128043": {
+ "content": "<|reserved_special_token_35|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128044": {
+ "content": "<|reserved_special_token_36|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128045": {
+ "content": "<|reserved_special_token_37|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128046": {
+ "content": "<|reserved_special_token_38|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128047": {
+ "content": "<|reserved_special_token_39|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128048": {
+ "content": "<|reserved_special_token_40|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128049": {
+ "content": "<|reserved_special_token_41|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128050": {
+ "content": "<|reserved_special_token_42|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128051": {
+ "content": "<|reserved_special_token_43|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128052": {
+ "content": "<|reserved_special_token_44|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128053": {
+ "content": "<|reserved_special_token_45|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128054": {
+ "content": "<|reserved_special_token_46|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128055": {
+ "content": "<|reserved_special_token_47|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128056": {
+ "content": "<|reserved_special_token_48|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128057": {
+ "content": "<|reserved_special_token_49|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128058": {
+ "content": "<|reserved_special_token_50|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128059": {
+ "content": "<|reserved_special_token_51|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128060": {
+ "content": "<|reserved_special_token_52|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128061": {
+ "content": "<|reserved_special_token_53|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128062": {
+ "content": "<|reserved_special_token_54|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128063": {
+ "content": "<|reserved_special_token_55|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128064": {
+ "content": "<|reserved_special_token_56|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128065": {
+ "content": "<|reserved_special_token_57|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128066": {
+ "content": "<|reserved_special_token_58|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128067": {
+ "content": "<|reserved_special_token_59|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128068": {
+ "content": "<|reserved_special_token_60|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128069": {
+ "content": "<|reserved_special_token_61|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128070": {
+ "content": "<|reserved_special_token_62|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128071": {
+ "content": "<|reserved_special_token_63|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128072": {
+ "content": "<|reserved_special_token_64|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128073": {
+ "content": "<|reserved_special_token_65|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128074": {
+ "content": "<|reserved_special_token_66|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128075": {
+ "content": "<|reserved_special_token_67|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128076": {
+ "content": "<|reserved_special_token_68|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128077": {
+ "content": "<|reserved_special_token_69|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128078": {
+ "content": "<|reserved_special_token_70|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128079": {
+ "content": "<|reserved_special_token_71|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128080": {
+ "content": "<|reserved_special_token_72|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128081": {
+ "content": "<|reserved_special_token_73|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128082": {
+ "content": "<|reserved_special_token_74|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128083": {
+ "content": "<|reserved_special_token_75|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128084": {
+ "content": "<|reserved_special_token_76|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128085": {
+ "content": "<|reserved_special_token_77|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128086": {
+ "content": "<|reserved_special_token_78|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128087": {
+ "content": "<|reserved_special_token_79|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128088": {
+ "content": "<|reserved_special_token_80|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128089": {
+ "content": "<|reserved_special_token_81|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128090": {
+ "content": "<|reserved_special_token_82|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128091": {
+ "content": "<|reserved_special_token_83|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128092": {
+ "content": "<|reserved_special_token_84|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128093": {
+ "content": "<|reserved_special_token_85|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128094": {
+ "content": "<|reserved_special_token_86|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128095": {
+ "content": "<|reserved_special_token_87|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128096": {
+ "content": "<|reserved_special_token_88|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128097": {
+ "content": "<|reserved_special_token_89|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128098": {
+ "content": "<|reserved_special_token_90|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128099": {
+ "content": "<|reserved_special_token_91|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128100": {
+ "content": "<|reserved_special_token_92|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128101": {
+ "content": "<|reserved_special_token_93|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128102": {
+ "content": "<|reserved_special_token_94|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128103": {
+ "content": "<|reserved_special_token_95|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128104": {
+ "content": "<|reserved_special_token_96|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128105": {
+ "content": "<|reserved_special_token_97|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128106": {
+ "content": "<|reserved_special_token_98|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128107": {
+ "content": "<|reserved_special_token_99|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128108": {
+ "content": "<|reserved_special_token_100|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128109": {
+ "content": "<|reserved_special_token_101|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128110": {
+ "content": "<|reserved_special_token_102|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128111": {
+ "content": "<|reserved_special_token_103|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128112": {
+ "content": "<|reserved_special_token_104|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128113": {
+ "content": "<|reserved_special_token_105|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128114": {
+ "content": "<|reserved_special_token_106|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128115": {
+ "content": "<|reserved_special_token_107|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128116": {
+ "content": "<|reserved_special_token_108|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128117": {
+ "content": "<|reserved_special_token_109|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128118": {
+ "content": "<|reserved_special_token_110|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128119": {
+ "content": "<|reserved_special_token_111|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128120": {
+ "content": "<|reserved_special_token_112|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128121": {
+ "content": "<|reserved_special_token_113|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128122": {
+ "content": "<|reserved_special_token_114|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128123": {
+ "content": "<|reserved_special_token_115|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128124": {
+ "content": "<|reserved_special_token_116|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128125": {
+ "content": "<|reserved_special_token_117|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128126": {
+ "content": "<|reserved_special_token_118|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128127": {
+ "content": "<|reserved_special_token_119|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128128": {
+ "content": "<|reserved_special_token_120|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128129": {
+ "content": "<|reserved_special_token_121|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128130": {
+ "content": "<|reserved_special_token_122|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128131": {
+ "content": "<|reserved_special_token_123|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128132": {
+ "content": "<|reserved_special_token_124|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128133": {
+ "content": "<|reserved_special_token_125|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128134": {
+ "content": "<|reserved_special_token_126|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128135": {
+ "content": "<|reserved_special_token_127|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128136": {
+ "content": "<|reserved_special_token_128|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128137": {
+ "content": "<|reserved_special_token_129|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+        "128138": {
+            "content": "<|reserved_special_token_130|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128139": {
+            "content": "<|reserved_special_token_131|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128140": {
+            "content": "<|reserved_special_token_132|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128141": {
+            "content": "<|reserved_special_token_133|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128142": {
+            "content": "<|reserved_special_token_134|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128143": {
+            "content": "<|reserved_special_token_135|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128144": {
+            "content": "<|reserved_special_token_136|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128145": {
+            "content": "<|reserved_special_token_137|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128146": {
+            "content": "<|reserved_special_token_138|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128147": {
+            "content": "<|reserved_special_token_139|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128148": {
+            "content": "<|reserved_special_token_140|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128149": {
+            "content": "<|reserved_special_token_141|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128150": {
+            "content": "<|reserved_special_token_142|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128151": {
+            "content": "<|reserved_special_token_143|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128152": {
+            "content": "<|reserved_special_token_144|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128153": {
+            "content": "<|reserved_special_token_145|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128154": {
+            "content": "<|reserved_special_token_146|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128155": {
+            "content": "<|reserved_special_token_147|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128156": {
+            "content": "<|reserved_special_token_148|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128157": {
+            "content": "<|reserved_special_token_149|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128158": {
+            "content": "<|reserved_special_token_150|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128159": {
+            "content": "<|reserved_special_token_151|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128160": {
+            "content": "<|reserved_special_token_152|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128161": {
+            "content": "<|reserved_special_token_153|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128162": {
+            "content": "<|reserved_special_token_154|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128163": {
+            "content": "<|reserved_special_token_155|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128164": {
+            "content": "<|reserved_special_token_156|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128165": {
+            "content": "<|reserved_special_token_157|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128166": {
+            "content": "<|reserved_special_token_158|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128167": {
+            "content": "<|reserved_special_token_159|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128168": {
+            "content": "<|reserved_special_token_160|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128169": {
+            "content": "<|reserved_special_token_161|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128170": {
+            "content": "<|reserved_special_token_162|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128171": {
+            "content": "<|reserved_special_token_163|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128172": {
+            "content": "<|reserved_special_token_164|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128173": {
+            "content": "<|reserved_special_token_165|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128174": {
+            "content": "<|reserved_special_token_166|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128175": {
+            "content": "<|reserved_special_token_167|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128176": {
+            "content": "<|reserved_special_token_168|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128177": {
+            "content": "<|reserved_special_token_169|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128178": {
+            "content": "<|reserved_special_token_170|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128179": {
+            "content": "<|reserved_special_token_171|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128180": {
+            "content": "<|reserved_special_token_172|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128181": {
+            "content": "<|reserved_special_token_173|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128182": {
+            "content": "<|reserved_special_token_174|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128183": {
+            "content": "<|reserved_special_token_175|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128184": {
+            "content": "<|reserved_special_token_176|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128185": {
+            "content": "<|reserved_special_token_177|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128186": {
+            "content": "<|reserved_special_token_178|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128187": {
+            "content": "<|reserved_special_token_179|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128188": {
+            "content": "<|reserved_special_token_180|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128189": {
+            "content": "<|reserved_special_token_181|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128190": {
+            "content": "<|reserved_special_token_182|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128191": {
+            "content": "<|reserved_special_token_183|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128192": {
+            "content": "<|reserved_special_token_184|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128193": {
+            "content": "<|reserved_special_token_185|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128194": {
+            "content": "<|reserved_special_token_186|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128195": {
+            "content": "<|reserved_special_token_187|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128196": {
+            "content": "<|reserved_special_token_188|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128197": {
+            "content": "<|reserved_special_token_189|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128198": {
+            "content": "<|reserved_special_token_190|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128199": {
+            "content": "<|reserved_special_token_191|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128200": {
+            "content": "<|reserved_special_token_192|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128201": {
+            "content": "<|reserved_special_token_193|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128202": {
+            "content": "<|reserved_special_token_194|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128203": {
+            "content": "<|reserved_special_token_195|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128204": {
+            "content": "<|reserved_special_token_196|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128205": {
+            "content": "<|reserved_special_token_197|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128206": {
+            "content": "<|reserved_special_token_198|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128207": {
+            "content": "<|reserved_special_token_199|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128208": {
+            "content": "<|reserved_special_token_200|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128209": {
+            "content": "<|reserved_special_token_201|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128210": {
+            "content": "<|reserved_special_token_202|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128211": {
+            "content": "<|reserved_special_token_203|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128212": {
+            "content": "<|reserved_special_token_204|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128213": {
+            "content": "<|reserved_special_token_205|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128214": {
+            "content": "<|reserved_special_token_206|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128215": {
+            "content": "<|reserved_special_token_207|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128216": {
+            "content": "<|reserved_special_token_208|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128217": {
+            "content": "<|reserved_special_token_209|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128218": {
+            "content": "<|reserved_special_token_210|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128219": {
+            "content": "<|reserved_special_token_211|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128220": {
+            "content": "<|reserved_special_token_212|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128221": {
+            "content": "<|reserved_special_token_213|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128222": {
+            "content": "<|reserved_special_token_214|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128223": {
+            "content": "<|reserved_special_token_215|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128224": {
+            "content": "<|reserved_special_token_216|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128225": {
+            "content": "<|reserved_special_token_217|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128226": {
+            "content": "<|reserved_special_token_218|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128227": {
+            "content": "<|reserved_special_token_219|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128228": {
+            "content": "<|reserved_special_token_220|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128229": {
+            "content": "<|reserved_special_token_221|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128230": {
+            "content": "<|reserved_special_token_222|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128231": {
+            "content": "<|reserved_special_token_223|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128232": {
+            "content": "<|reserved_special_token_224|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128233": {
+            "content": "<|reserved_special_token_225|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128234": {
+            "content": "<|reserved_special_token_226|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128235": {
+            "content": "<|reserved_special_token_227|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128236": {
+            "content": "<|reserved_special_token_228|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128237": {
+            "content": "<|reserved_special_token_229|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128238": {
+            "content": "<|reserved_special_token_230|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128239": {
+            "content": "<|reserved_special_token_231|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128240": {
+            "content": "<|reserved_special_token_232|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128241": {
+            "content": "<|reserved_special_token_233|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128242": {
+            "content": "<|reserved_special_token_234|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128243": {
+            "content": "<|reserved_special_token_235|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128244": {
+            "content": "<|reserved_special_token_236|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128245": {
+            "content": "<|reserved_special_token_237|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128246": {
+            "content": "<|reserved_special_token_238|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128247": {
+            "content": "<|reserved_special_token_239|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128248": {
+            "content": "<|reserved_special_token_240|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128249": {
+            "content": "<|reserved_special_token_241|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128250": {
+            "content": "<|reserved_special_token_242|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128251": {
+            "content": "<|reserved_special_token_243|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128252": {
+            "content": "<|reserved_special_token_244|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128253": {
+            "content": "<|reserved_special_token_245|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128254": {
+            "content": "<|reserved_special_token_246|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        },
+        "128255": {
+            "content": "<|reserved_special_token_247|>",
+            "lstrip": false,
+            "normalized": false,
+            "rstrip": false,
+            "single_word": false,
+            "special": true
+        }
+    },
+    "bos_token": "<|begin_of_text|>",
+    "chat_template": "{{- bos_token }}\n{%- if custom_tools is defined %}\n    {%- set tools = custom_tools %}\n{%- endif %}\n{%- if not tools_in_user_message is defined %}\n    {%- set tools_in_user_message = true %}\n{%- endif %}\n{%- if not date_string is defined %}\n    {%- set date_string = \"26 Jul 2024\" %}\n{%- endif %}\n{%- if not tools is defined %}\n    {%- set tools = none %}\n{%- endif %}\n\n{#- This block extracts the system message, so we can slot it into the right place. #}\n{%- if messages[0]['role'] == 'system' %}\n    {%- set system_message = messages[0]['content']|trim %}\n    {%- set messages = messages[1:] %}\n{%- else %}\n    {%- set system_message = \"\" %}\n{%- endif %}\n\n{#- System message + builtin tools #}\n{{- \"<|start_header_id|>system<|end_header_id|>\\n\\n\" }}\n{%- if builtin_tools is defined or tools is not none %}\n    {{- \"Environment: ipython\\n\" }}\n{%- endif %}\n{%- if builtin_tools is defined %}\n    {{- \"Tools: \" + builtin_tools | reject('equalto', 'code_interpreter') | join(\", \") + \"\\n\\n\"}}\n{%- endif %}\n{{- \"Cutting Knowledge Date: December 2023\\n\" }}\n{{- \"Today Date: \" + date_string + \"\\n\\n\" }}\n{%- if tools is not none and not tools_in_user_message %}\n    {{- \"You have access to the following functions. To call a function, please respond with JSON for a function call.\" }}\n    {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' }}\n    {{- \"Do not use variables.\\n\\n\" }}\n    {%- for t in tools %}\n        {{- t | tojson(indent=4) }}\n        {{- \"\\n\\n\" }}\n    {%- endfor %}\n{%- endif %}\n{{- system_message }}\n{{- \"<|eot_id|>\" }}\n\n{#- Custom tools are passed in a user message with some extra guidance #}\n{%- if tools_in_user_message and not tools is none %}\n    {#- Extract the first user message so we can plug it in here #}\n    {%- if messages | length != 0 %}\n        {%- set first_user_message = messages[0]['content']|trim %}\n        {%- set messages = messages[1:] %}\n    {%- else %}\n        {{- raise_exception(\"Cannot put tools in the first user message when there's no first user message!\") }}\n{%- endif %}\n    {{- '<|start_header_id|>user<|end_header_id|>\\n\\n' -}}\n    {{- \"Given the following functions, please respond with a JSON for a function call \" }}\n    {{- \"with its proper arguments that best answers the given prompt.\\n\\n\" }}\n    {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' }}\n    {{- \"Do not use variables.\\n\\n\" }}\n    {%- for t in tools %}\n        {{- t | tojson(indent=4) }}\n        {{- \"\\n\\n\" }}\n    {%- endfor %}\n    {{- first_user_message + \"<|eot_id|>\"}}\n{%- endif %}\n\n{%- for message in messages %}\n    {%- if not (message.role == 'ipython' or message.role == 'tool' or 'tool_calls' in message) %}\n        {{- '<|start_header_id|>' + message['role'] + '<|end_header_id|>\\n\\n'+ message['content'] | trim + '<|eot_id|>' }}\n    {%- elif 'tool_calls' in message %}\n        {%- if not message.tool_calls|length == 1 %}\n            {{- raise_exception(\"This model only supports single tool-calls at once!\") }}\n        {%- endif %}\n        {%- set tool_call = message.tool_calls[0].function %}\n        {%- if builtin_tools is defined and tool_call.name in builtin_tools %}\n            {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' -}}\n            {{- \"<|python_tag|>\" + tool_call.name + \".call(\" }}\n            {%- for arg_name, arg_val in tool_call.arguments | items %}\n                {{- arg_name + '=\"' + arg_val + '\"' }}\n                {%- if not loop.last %}\n                    {{- \", \" }}\n                {%- endif %}\n            {%- endfor %}\n            {{- \")\" }}\n        {%- else  %}\n            {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' -}}\n            {{- '{\"name\": \"' + tool_call.name + '\", ' }}\n            {{- '\"parameters\": ' }}\n            {{- tool_call.arguments | tojson }}\n            {{- \"}\" }}\n        {%- endif %}\n        {%- if builtin_tools is defined %}\n            {#- This means we're in ipython mode #}\n            {{- \"<|eom_id|>\" }}\n        {%- else %}\n            {{- \"<|eot_id|>\" }}\n        {%- endif %}\n    {%- elif message.role == \"tool\" or message.role == \"ipython\" %}\n        {{- \"<|start_header_id|>ipython<|end_header_id|>\\n\\n\" }}\n        {%- if message.content is mapping or message.content is iterable %}\n            {{- message.content | tojson }}\n        {%- else %}\n            {{- message.content }}\n        {%- endif %}\n        {{- \"<|eot_id|>\" }}\n    {%- endif %}\n{%- endfor %}\n{%- if add_generation_prompt %}\n    {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' }}\n{%- endif %}\n",
+ "clean_up_tokenization_spaces": true,
+ "eos_token": "<|eot_id|>",
+ "extra_special_tokens": {},
+ "model_input_names": [
+ "input_ids",
+ "attention_mask"
+ ],
+ "model_max_length": 131072,
+ "pad_token": "<|end_of_text|>",
+ "tokenizer_class": "PreTrainedTokenizer"
+}
diff --git a/checkpoint-425/trainer_state.json b/checkpoint-425/trainer_state.json
new file mode 100644
index 0000000000000000000000000000000000000000..9997d2a765f606882a8fd502c69344b464fddbae
--- /dev/null
+++ b/checkpoint-425/trainer_state.json
@@ -0,0 +1,3008 @@
+{
+ "best_metric": null,
+ "best_model_checkpoint": null,
+ "epoch": 4.94921875,
+ "eval_steps": 500,
+ "global_step": 425,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.01171875,
+ "grad_norm": 36.23282241821289,
+ "learning_rate": 5.0000000000000004e-08,
+ "loss": 2.3839,
+ "step": 1
+ },
+ {
+ "epoch": 0.0234375,
+ "grad_norm": 35.918636322021484,
+ "learning_rate": 1.0000000000000001e-07,
+ "loss": 2.3798,
+ "step": 2
+ },
+ {
+ "epoch": 0.03515625,
+ "grad_norm": 35.62618637084961,
+ "learning_rate": 1.5000000000000002e-07,
+ "loss": 2.386,
+ "step": 3
+ },
+ {
+ "epoch": 0.046875,
+ "grad_norm": 35.966087341308594,
+ "learning_rate": 2.0000000000000002e-07,
+ "loss": 2.3803,
+ "step": 4
+ },
+ {
+ "epoch": 0.05859375,
+ "grad_norm": 35.38177490234375,
+ "learning_rate": 2.5000000000000004e-07,
+ "loss": 2.3937,
+ "step": 5
+ },
+ {
+ "epoch": 0.0703125,
+ "grad_norm": 35.99677658081055,
+ "learning_rate": 3.0000000000000004e-07,
+ "loss": 2.3906,
+ "step": 6
+ },
+ {
+ "epoch": 0.08203125,
+ "grad_norm": 35.44341278076172,
+ "learning_rate": 3.5000000000000004e-07,
+ "loss": 2.3539,
+ "step": 7
+ },
+ {
+ "epoch": 0.09375,
+ "grad_norm": 35.300697326660156,
+ "learning_rate": 4.0000000000000003e-07,
+ "loss": 2.3459,
+ "step": 8
+ },
+ {
+ "epoch": 0.10546875,
+ "grad_norm": 34.092952728271484,
+ "learning_rate": 4.5000000000000003e-07,
+ "loss": 2.2959,
+ "step": 9
+ },
+ {
+ "epoch": 0.1171875,
+ "grad_norm": 34.46371841430664,
+ "learning_rate": 5.000000000000001e-07,
+ "loss": 2.2661,
+ "step": 10
+ },
+ {
+ "epoch": 0.12890625,
+ "grad_norm": 34.62260818481445,
+ "learning_rate": 5.5e-07,
+ "loss": 2.2918,
+ "step": 11
+ },
+ {
+ "epoch": 0.140625,
+ "grad_norm": 33.790374755859375,
+ "learning_rate": 6.000000000000001e-07,
+ "loss": 2.223,
+ "step": 12
+ },
+ {
+ "epoch": 0.15234375,
+ "grad_norm": 33.766536712646484,
+ "learning_rate": 6.5e-07,
+ "loss": 2.2267,
+ "step": 13
+ },
+ {
+ "epoch": 0.1640625,
+ "grad_norm": 33.894081115722656,
+ "learning_rate": 7.000000000000001e-07,
+ "loss": 2.1465,
+ "step": 14
+ },
+ {
+ "epoch": 0.17578125,
+ "grad_norm": 33.162452697753906,
+ "learning_rate": 7.5e-07,
+ "loss": 2.0495,
+ "step": 15
+ },
+ {
+ "epoch": 0.1875,
+ "grad_norm": 32.954341888427734,
+ "learning_rate": 8.000000000000001e-07,
+ "loss": 1.9627,
+ "step": 16
+ },
+ {
+ "epoch": 0.19921875,
+ "grad_norm": 33.96324157714844,
+ "learning_rate": 8.500000000000001e-07,
+ "loss": 1.8867,
+ "step": 17
+ },
+ {
+ "epoch": 0.2109375,
+ "grad_norm": 33.81139373779297,
+ "learning_rate": 9.000000000000001e-07,
+ "loss": 1.7752,
+ "step": 18
+ },
+ {
+ "epoch": 0.22265625,
+ "grad_norm": 34.87086868286133,
+ "learning_rate": 9.500000000000001e-07,
+ "loss": 1.6944,
+ "step": 19
+ },
+ {
+ "epoch": 0.234375,
+ "grad_norm": 34.84965133666992,
+ "learning_rate": 1.0000000000000002e-06,
+ "loss": 1.5707,
+ "step": 20
+ },
+ {
+ "epoch": 0.24609375,
+ "grad_norm": 35.227317810058594,
+ "learning_rate": 1.0500000000000001e-06,
+ "loss": 1.4369,
+ "step": 21
+ },
+ {
+ "epoch": 0.2578125,
+ "grad_norm": 34.91344451904297,
+ "learning_rate": 1.1e-06,
+ "loss": 1.3202,
+ "step": 22
+ },
+ {
+ "epoch": 0.26953125,
+ "grad_norm": 31.7376766204834,
+ "learning_rate": 1.1500000000000002e-06,
+ "loss": 1.1398,
+ "step": 23
+ },
+ {
+ "epoch": 0.28125,
+ "grad_norm": 30.24741554260254,
+ "learning_rate": 1.2000000000000002e-06,
+ "loss": 1.0421,
+ "step": 24
+ },
+ {
+ "epoch": 0.29296875,
+ "grad_norm": 28.292400360107422,
+ "learning_rate": 1.25e-06,
+ "loss": 0.8817,
+ "step": 25
+ },
+ {
+ "epoch": 0.3046875,
+ "grad_norm": 30.44672393798828,
+ "learning_rate": 1.3e-06,
+ "loss": 0.7073,
+ "step": 26
+ },
+ {
+ "epoch": 0.31640625,
+ "grad_norm": 29.416427612304688,
+ "learning_rate": 1.3500000000000002e-06,
+ "loss": 0.5444,
+ "step": 27
+ },
+ {
+ "epoch": 0.328125,
+ "grad_norm": 24.820096969604492,
+ "learning_rate": 1.4000000000000001e-06,
+ "loss": 0.4025,
+ "step": 28
+ },
+ {
+ "epoch": 0.33984375,
+ "grad_norm": 21.023277282714844,
+ "learning_rate": 1.45e-06,
+ "loss": 0.307,
+ "step": 29
+ },
+ {
+ "epoch": 0.3515625,
+ "grad_norm": 19.656967163085938,
+ "learning_rate": 1.5e-06,
+ "loss": 0.2151,
+ "step": 30
+ },
+ {
+ "epoch": 0.36328125,
+ "grad_norm": 14.91929817199707,
+ "learning_rate": 1.5500000000000002e-06,
+ "loss": 0.1448,
+ "step": 31
+ },
+ {
+ "epoch": 0.375,
+ "grad_norm": 5.083199977874756,
+ "learning_rate": 1.6000000000000001e-06,
+ "loss": 0.09,
+ "step": 32
+ },
+ {
+ "epoch": 0.38671875,
+ "grad_norm": 2.320681571960449,
+ "learning_rate": 1.6500000000000003e-06,
+ "loss": 0.0641,
+ "step": 33
+ },
+ {
+ "epoch": 0.3984375,
+ "grad_norm": 1.6233159303665161,
+ "learning_rate": 1.7000000000000002e-06,
+ "loss": 0.0584,
+ "step": 34
+ },
+ {
+ "epoch": 0.41015625,
+ "grad_norm": 1.6057201623916626,
+ "learning_rate": 1.75e-06,
+ "loss": 0.0626,
+ "step": 35
+ },
+ {
+ "epoch": 0.421875,
+ "grad_norm": 1.8360320329666138,
+ "learning_rate": 1.8000000000000001e-06,
+ "loss": 0.0563,
+ "step": 36
+ },
+ {
+ "epoch": 0.43359375,
+ "grad_norm": 1.736350178718567,
+ "learning_rate": 1.85e-06,
+ "loss": 0.0609,
+ "step": 37
+ },
+ {
+ "epoch": 0.4453125,
+ "grad_norm": 1.1473922729492188,
+ "learning_rate": 1.9000000000000002e-06,
+ "loss": 0.0541,
+ "step": 38
+ },
+ {
+ "epoch": 0.45703125,
+ "grad_norm": 1.1722168922424316,
+ "learning_rate": 1.9500000000000004e-06,
+ "loss": 0.0534,
+ "step": 39
+ },
+ {
+ "epoch": 0.46875,
+ "grad_norm": 1.356987714767456,
+ "learning_rate": 2.0000000000000003e-06,
+ "loss": 0.0496,
+ "step": 40
+ },
+ {
+ "epoch": 0.48046875,
+ "grad_norm": 0.8023216724395752,
+ "learning_rate": 2.05e-06,
+ "loss": 0.0527,
+ "step": 41
+ },
+ {
+ "epoch": 0.4921875,
+ "grad_norm": 0.9803515672683716,
+ "learning_rate": 2.1000000000000002e-06,
+ "loss": 0.0478,
+ "step": 42
+ },
+ {
+ "epoch": 0.50390625,
+ "grad_norm": 0.8733468651771545,
+ "learning_rate": 2.15e-06,
+ "loss": 0.052,
+ "step": 43
+ },
+ {
+ "epoch": 0.515625,
+ "grad_norm": 0.8213743567466736,
+ "learning_rate": 2.2e-06,
+ "loss": 0.0448,
+ "step": 44
+ },
+ {
+ "epoch": 0.52734375,
+ "grad_norm": 0.843189537525177,
+ "learning_rate": 2.25e-06,
+ "loss": 0.0498,
+ "step": 45
+ },
+ {
+ "epoch": 0.5390625,
+ "grad_norm": 0.8801079392433167,
+ "learning_rate": 2.3000000000000004e-06,
+ "loss": 0.0408,
+ "step": 46
+ },
+ {
+ "epoch": 0.55078125,
+ "grad_norm": 0.7131401300430298,
+ "learning_rate": 2.35e-06,
+ "loss": 0.0405,
+ "step": 47
+ },
+ {
+ "epoch": 0.5625,
+ "grad_norm": 0.8996126651763916,
+ "learning_rate": 2.4000000000000003e-06,
+ "loss": 0.0525,
+ "step": 48
+ },
+ {
+ "epoch": 0.57421875,
+ "grad_norm": 0.8606986403465271,
+ "learning_rate": 2.4500000000000003e-06,
+ "loss": 0.0438,
+ "step": 49
+ },
+ {
+ "epoch": 0.5859375,
+ "grad_norm": 0.6918051838874817,
+ "learning_rate": 2.5e-06,
+ "loss": 0.0394,
+ "step": 50
+ },
+ {
+ "epoch": 0.59765625,
+ "grad_norm": 0.6177802085876465,
+ "learning_rate": 2.55e-06,
+ "loss": 0.0387,
+ "step": 51
+ },
+ {
+ "epoch": 0.609375,
+ "grad_norm": 0.7042555809020996,
+ "learning_rate": 2.6e-06,
+ "loss": 0.0434,
+ "step": 52
+ },
+ {
+ "epoch": 0.62109375,
+ "grad_norm": 0.6537717580795288,
+ "learning_rate": 2.6500000000000005e-06,
+ "loss": 0.0396,
+ "step": 53
+ },
+ {
+ "epoch": 0.6328125,
+ "grad_norm": 0.7834082841873169,
+ "learning_rate": 2.7000000000000004e-06,
+ "loss": 0.0411,
+ "step": 54
+ },
+ {
+ "epoch": 0.64453125,
+ "grad_norm": 0.7287272810935974,
+ "learning_rate": 2.7500000000000004e-06,
+ "loss": 0.0408,
+ "step": 55
+ },
+ {
+ "epoch": 0.65625,
+ "grad_norm": 0.7186263203620911,
+ "learning_rate": 2.8000000000000003e-06,
+ "loss": 0.0394,
+ "step": 56
+ },
+ {
+ "epoch": 0.66796875,
+ "grad_norm": 0.7264899611473083,
+ "learning_rate": 2.85e-06,
+ "loss": 0.0427,
+ "step": 57
+ },
+ {
+ "epoch": 0.6796875,
+ "grad_norm": 0.7665618062019348,
+ "learning_rate": 2.9e-06,
+ "loss": 0.0368,
+ "step": 58
+ },
+ {
+ "epoch": 0.69140625,
+ "grad_norm": 0.7222962379455566,
+ "learning_rate": 2.95e-06,
+ "loss": 0.0412,
+ "step": 59
+ },
+ {
+ "epoch": 0.703125,
+ "grad_norm": 0.7061101794242859,
+ "learning_rate": 3e-06,
+ "loss": 0.0377,
+ "step": 60
+ },
+ {
+ "epoch": 0.71484375,
+ "grad_norm": 0.5724324584007263,
+ "learning_rate": 3.05e-06,
+ "loss": 0.0387,
+ "step": 61
+ },
+ {
+ "epoch": 0.7265625,
+ "grad_norm": 0.5535506010055542,
+ "learning_rate": 3.1000000000000004e-06,
+ "loss": 0.0403,
+ "step": 62
+ },
+ {
+ "epoch": 0.73828125,
+ "grad_norm": 0.6553678512573242,
+ "learning_rate": 3.1500000000000003e-06,
+ "loss": 0.0415,
+ "step": 63
+ },
+ {
+ "epoch": 0.75,
+ "grad_norm": 0.6137285828590393,
+ "learning_rate": 3.2000000000000003e-06,
+ "loss": 0.0383,
+ "step": 64
+ },
+ {
+ "epoch": 0.76171875,
+ "grad_norm": 0.5985754132270813,
+ "learning_rate": 3.2500000000000002e-06,
+ "loss": 0.0355,
+ "step": 65
+ },
+ {
+ "epoch": 0.7734375,
+ "grad_norm": 0.5903909802436829,
+ "learning_rate": 3.3000000000000006e-06,
+ "loss": 0.0374,
+ "step": 66
+ },
+ {
+ "epoch": 0.78515625,
+ "grad_norm": 0.5718765258789062,
+ "learning_rate": 3.3500000000000005e-06,
+ "loss": 0.0339,
+ "step": 67
+ },
+ {
+ "epoch": 0.796875,
+ "grad_norm": 0.6844965815544128,
+ "learning_rate": 3.4000000000000005e-06,
+ "loss": 0.0405,
+ "step": 68
+ },
+ {
+ "epoch": 0.80859375,
+ "grad_norm": 0.5959618091583252,
+ "learning_rate": 3.45e-06,
+ "loss": 0.0338,
+ "step": 69
+ },
+ {
+ "epoch": 0.8203125,
+ "grad_norm": 0.6095123291015625,
+ "learning_rate": 3.5e-06,
+ "loss": 0.0362,
+ "step": 70
+ },
+ {
+ "epoch": 0.83203125,
+ "grad_norm": 0.543708086013794,
+ "learning_rate": 3.5500000000000003e-06,
+ "loss": 0.0355,
+ "step": 71
+ },
+ {
+ "epoch": 0.84375,
+ "grad_norm": 0.6969983577728271,
+ "learning_rate": 3.6000000000000003e-06,
+ "loss": 0.0325,
+ "step": 72
+ },
+ {
+ "epoch": 0.85546875,
+ "grad_norm": 0.6022969484329224,
+ "learning_rate": 3.65e-06,
+ "loss": 0.0342,
+ "step": 73
+ },
+ {
+ "epoch": 0.8671875,
+ "grad_norm": 0.6262147426605225,
+ "learning_rate": 3.7e-06,
+ "loss": 0.0348,
+ "step": 74
+ },
+ {
+ "epoch": 0.87890625,
+ "grad_norm": 0.5729933381080627,
+ "learning_rate": 3.7500000000000005e-06,
+ "loss": 0.0318,
+ "step": 75
+ },
+ {
+ "epoch": 0.890625,
+ "grad_norm": 0.5846775770187378,
+ "learning_rate": 3.8000000000000005e-06,
+ "loss": 0.0309,
+ "step": 76
+ },
+ {
+ "epoch": 0.90234375,
+ "grad_norm": 0.6469219923019409,
+ "learning_rate": 3.85e-06,
+ "loss": 0.0324,
+ "step": 77
+ },
+ {
+ "epoch": 0.9140625,
+ "grad_norm": 0.6574859023094177,
+ "learning_rate": 3.900000000000001e-06,
+ "loss": 0.0325,
+ "step": 78
+ },
+ {
+ "epoch": 0.92578125,
+ "grad_norm": 0.5833832025527954,
+ "learning_rate": 3.95e-06,
+ "loss": 0.0232,
+ "step": 79
+ },
+ {
+ "epoch": 0.9375,
+ "grad_norm": 0.7503570318222046,
+ "learning_rate": 4.000000000000001e-06,
+ "loss": 0.0267,
+ "step": 80
+ },
+ {
+ "epoch": 0.94921875,
+ "grad_norm": 0.7181633114814758,
+ "learning_rate": 4.05e-06,
+ "loss": 0.0304,
+ "step": 81
+ },
+ {
+ "epoch": 0.9609375,
+ "grad_norm": 0.6477274298667908,
+ "learning_rate": 4.1e-06,
+ "loss": 0.0297,
+ "step": 82
+ },
+ {
+ "epoch": 0.97265625,
+ "grad_norm": 0.6768563389778137,
+ "learning_rate": 4.15e-06,
+ "loss": 0.0279,
+ "step": 83
+ },
+ {
+ "epoch": 0.984375,
+ "grad_norm": 0.7905837297439575,
+ "learning_rate": 4.2000000000000004e-06,
+ "loss": 0.0301,
+ "step": 84
+ },
+ {
+ "epoch": 0.99609375,
+ "grad_norm": 0.5576608777046204,
+ "learning_rate": 4.25e-06,
+ "loss": 0.0322,
+ "step": 85
+ },
+ {
+ "epoch": 1.0,
+ "grad_norm": 0.5576608777046204,
+ "learning_rate": 4.3e-06,
+ "loss": 0.0226,
+ "step": 86
+ },
+ {
+ "epoch": 1.01171875,
+ "grad_norm": 1.0774812698364258,
+ "learning_rate": 4.350000000000001e-06,
+ "loss": 0.0215,
+ "step": 87
+ },
+ {
+ "epoch": 1.0234375,
+ "grad_norm": 0.47373324632644653,
+ "learning_rate": 4.4e-06,
+ "loss": 0.0235,
+ "step": 88
+ },
+ {
+ "epoch": 1.03515625,
+ "grad_norm": 0.7665970325469971,
+ "learning_rate": 4.450000000000001e-06,
+ "loss": 0.0242,
+ "step": 89
+ },
+ {
+ "epoch": 1.046875,
+ "grad_norm": 0.6290147304534912,
+ "learning_rate": 4.5e-06,
+ "loss": 0.0209,
+ "step": 90
+ },
+ {
+ "epoch": 1.05859375,
+ "grad_norm": 0.5703024864196777,
+ "learning_rate": 4.5500000000000005e-06,
+ "loss": 0.0192,
+ "step": 91
+ },
+ {
+ "epoch": 1.0703125,
+ "grad_norm": 0.6099259853363037,
+ "learning_rate": 4.600000000000001e-06,
+ "loss": 0.0181,
+ "step": 92
+ },
+ {
+ "epoch": 1.08203125,
+ "grad_norm": 0.6570988297462463,
+ "learning_rate": 4.65e-06,
+ "loss": 0.0201,
+ "step": 93
+ },
+ {
+ "epoch": 1.09375,
+ "grad_norm": 0.7848325371742249,
+ "learning_rate": 4.7e-06,
+ "loss": 0.0253,
+ "step": 94
+ },
+ {
+ "epoch": 1.10546875,
+ "grad_norm": 0.6759209036827087,
+ "learning_rate": 4.75e-06,
+ "loss": 0.0195,
+ "step": 95
+ },
+ {
+ "epoch": 1.1171875,
+ "grad_norm": 0.4861151874065399,
+ "learning_rate": 4.800000000000001e-06,
+ "loss": 0.0191,
+ "step": 96
+ },
+ {
+ "epoch": 1.12890625,
+ "grad_norm": 0.6268576383590698,
+ "learning_rate": 4.85e-06,
+ "loss": 0.0211,
+ "step": 97
+ },
+ {
+ "epoch": 1.140625,
+ "grad_norm": 0.5862017869949341,
+ "learning_rate": 4.9000000000000005e-06,
+ "loss": 0.0177,
+ "step": 98
+ },
+ {
+ "epoch": 1.15234375,
+ "grad_norm": 0.4569724202156067,
+ "learning_rate": 4.95e-06,
+ "loss": 0.0164,
+ "step": 99
+ },
+ {
+ "epoch": 1.1640625,
+ "grad_norm": 0.4539048969745636,
+ "learning_rate": 5e-06,
+ "loss": 0.0152,
+ "step": 100
+ },
+ {
+ "epoch": 1.17578125,
+ "grad_norm": 0.4553528428077698,
+ "learning_rate": 4.999926609487568e-06,
+ "loss": 0.0208,
+ "step": 101
+ },
+ {
+ "epoch": 1.1875,
+ "grad_norm": 0.5182592272758484,
+ "learning_rate": 4.999706442259205e-06,
+ "loss": 0.0154,
+ "step": 102
+ },
+ {
+ "epoch": 1.19921875,
+ "grad_norm": 0.5602673888206482,
+ "learning_rate": 4.999339511241458e-06,
+ "loss": 0.0196,
+ "step": 103
+ },
+ {
+ "epoch": 1.2109375,
+ "grad_norm": 0.7579494118690491,
+ "learning_rate": 4.9988258379777334e-06,
+ "loss": 0.0198,
+ "step": 104
+ },
+ {
+ "epoch": 1.22265625,
+ "grad_norm": 0.603757381439209,
+ "learning_rate": 4.998165452627025e-06,
+ "loss": 0.0185,
+ "step": 105
+ },
+ {
+ "epoch": 1.234375,
+ "grad_norm": 0.5520291924476624,
+ "learning_rate": 4.99735839396215e-06,
+ "loss": 0.018,
+ "step": 106
+ },
+ {
+ "epoch": 1.24609375,
+ "grad_norm": 0.55808424949646,
+ "learning_rate": 4.996404709367466e-06,
+ "loss": 0.0159,
+ "step": 107
+ },
+ {
+ "epoch": 1.2578125,
+ "grad_norm": 0.47174298763275146,
+ "learning_rate": 4.995304454836095e-06,
+ "loss": 0.0122,
+ "step": 108
+ },
+ {
+ "epoch": 1.26953125,
+ "grad_norm": 0.5289337038993835,
+ "learning_rate": 4.994057694966632e-06,
+ "loss": 0.0168,
+ "step": 109
+ },
+ {
+ "epoch": 1.28125,
+ "grad_norm": 0.5390430092811584,
+ "learning_rate": 4.992664502959351e-06,
+ "loss": 0.017,
+ "step": 110
+ },
+ {
+ "epoch": 1.29296875,
+ "grad_norm": 0.4966451823711395,
+ "learning_rate": 4.991124960611916e-06,
+ "loss": 0.0145,
+ "step": 111
+ },
+ {
+ "epoch": 1.3046875,
+ "grad_norm": 0.6148604154586792,
+ "learning_rate": 4.989439158314566e-06,
+ "loss": 0.0139,
+ "step": 112
+ },
+ {
+ "epoch": 1.31640625,
+ "grad_norm": 0.6303534507751465,
+ "learning_rate": 4.9876071950448185e-06,
+ "loss": 0.0118,
+ "step": 113
+ },
+ {
+ "epoch": 1.328125,
+ "grad_norm": 0.5410207509994507,
+ "learning_rate": 4.98562917836165e-06,
+ "loss": 0.0094,
+ "step": 114
+ },
+ {
+ "epoch": 1.33984375,
+ "grad_norm": 0.5350080132484436,
+ "learning_rate": 4.983505224399188e-06,
+ "loss": 0.0158,
+ "step": 115
+ },
+ {
+ "epoch": 1.3515625,
+ "grad_norm": 1.017317295074463,
+ "learning_rate": 4.9812354578598876e-06,
+ "loss": 0.0201,
+ "step": 116
+ },
+ {
+ "epoch": 1.36328125,
+ "grad_norm": 0.6891007423400879,
+ "learning_rate": 4.978820012007213e-06,
+ "loss": 0.0127,
+ "step": 117
+ },
+ {
+ "epoch": 1.375,
+ "grad_norm": 0.4756389260292053,
+ "learning_rate": 4.976259028657812e-06,
+ "loss": 0.0188,
+ "step": 118
+ },
+ {
+ "epoch": 1.38671875,
+ "grad_norm": 0.5957350730895996,
+ "learning_rate": 4.973552658173186e-06,
+ "loss": 0.011,
+ "step": 119
+ },
+ {
+ "epoch": 1.3984375,
+ "grad_norm": 0.5012223720550537,
+ "learning_rate": 4.970701059450872e-06,
+ "loss": 0.0138,
+ "step": 120
+ },
+ {
+ "epoch": 1.41015625,
+ "grad_norm": 0.4408419132232666,
+ "learning_rate": 4.9677043999151e-06,
+ "loss": 0.0144,
+ "step": 121
+ },
+ {
+ "epoch": 1.421875,
+ "grad_norm": 0.5721736550331116,
+ "learning_rate": 4.964562855506976e-06,
+ "loss": 0.0135,
+ "step": 122
+ },
+ {
+ "epoch": 1.43359375,
+ "grad_norm": 0.5479208827018738,
+ "learning_rate": 4.961276610674141e-06,
+ "loss": 0.0128,
+ "step": 123
+ },
+ {
+ "epoch": 1.4453125,
+ "grad_norm": 1.0117675065994263,
+ "learning_rate": 4.9578458583599495e-06,
+ "loss": 0.0111,
+ "step": 124
+ },
+ {
+ "epoch": 1.45703125,
+ "grad_norm": 0.5504026412963867,
+ "learning_rate": 4.954270799992138e-06,
+ "loss": 0.0083,
+ "step": 125
+ },
+ {
+ "epoch": 1.46875,
+ "grad_norm": 0.48403099179267883,
+ "learning_rate": 4.950551645470998e-06,
+ "loss": 0.0083,
+ "step": 126
+ },
+ {
+ "epoch": 1.48046875,
+ "grad_norm": 0.6866800785064697,
+ "learning_rate": 4.9466886131570565e-06,
+ "loss": 0.0085,
+ "step": 127
+ },
+ {
+ "epoch": 1.4921875,
+ "grad_norm": 0.872557520866394,
+ "learning_rate": 4.942681929858249e-06,
+ "loss": 0.0102,
+ "step": 128
+ },
+ {
+ "epoch": 1.50390625,
+ "grad_norm": 0.6924716234207153,
+ "learning_rate": 4.9385318308166065e-06,
+ "loss": 0.012,
+ "step": 129
+ },
+ {
+ "epoch": 1.515625,
+ "grad_norm": 0.5060118436813354,
+ "learning_rate": 4.934238559694448e-06,
+ "loss": 0.0084,
+ "step": 130
+ },
+ {
+ "epoch": 1.52734375,
+ "grad_norm": 0.6256171464920044,
+ "learning_rate": 4.929802368560066e-06,
+ "loss": 0.0081,
+ "step": 131
+ },
+ {
+ "epoch": 1.5390625,
+ "grad_norm": 0.5422537922859192,
+ "learning_rate": 4.925223517872934e-06,
+ "loss": 0.0077,
+ "step": 132
+ },
+ {
+ "epoch": 1.55078125,
+ "grad_norm": 0.953416109085083,
+ "learning_rate": 4.920502276468408e-06,
+ "loss": 0.0078,
+ "step": 133
+ },
+ {
+ "epoch": 1.5625,
+ "grad_norm": 0.4540804624557495,
+ "learning_rate": 4.915638921541952e-06,
+ "loss": 0.0097,
+ "step": 134
+ },
+ {
+ "epoch": 1.57421875,
+ "grad_norm": 0.3773641884326935,
+ "learning_rate": 4.9106337386328524e-06,
+ "loss": 0.0098,
+ "step": 135
+ },
+ {
+ "epoch": 1.5859375,
+ "grad_norm": 0.7970175743103027,
+ "learning_rate": 4.905487021607462e-06,
+ "loss": 0.0056,
+ "step": 136
+ },
+ {
+ "epoch": 1.59765625,
+ "grad_norm": 0.45197635889053345,
+ "learning_rate": 4.900199072641937e-06,
+ "loss": 0.0078,
+ "step": 137
+ },
+ {
+ "epoch": 1.609375,
+ "grad_norm": 0.38231438398361206,
+ "learning_rate": 4.894770202204509e-06,
+ "loss": 0.0072,
+ "step": 138
+ },
+ {
+ "epoch": 1.62109375,
+ "grad_norm": 0.2945426404476166,
+ "learning_rate": 4.889200729037241e-06,
+ "loss": 0.0086,
+ "step": 139
+ },
+ {
+ "epoch": 1.6328125,
+ "grad_norm": 0.49699363112449646,
+ "learning_rate": 4.883490980137327e-06,
+ "loss": 0.0073,
+ "step": 140
+ },
+ {
+ "epoch": 1.64453125,
+ "grad_norm": 0.38112956285476685,
+ "learning_rate": 4.8776412907378845e-06,
+ "loss": 0.0056,
+ "step": 141
+ },
+ {
+ "epoch": 1.65625,
+ "grad_norm": 0.46780407428741455,
+ "learning_rate": 4.871652004288275e-06,
+ "loss": 0.0078,
+ "step": 142
+ },
+ {
+ "epoch": 1.66796875,
+ "grad_norm": 0.43764325976371765,
+ "learning_rate": 4.865523472433942e-06,
+ "loss": 0.005,
+ "step": 143
+ },
+ {
+ "epoch": 1.6796875,
+ "grad_norm": 0.3445664644241333,
+ "learning_rate": 4.859256054995758e-06,
+ "loss": 0.0069,
+ "step": 144
+ },
+ {
+ "epoch": 1.69140625,
+ "grad_norm": 0.40410447120666504,
+ "learning_rate": 4.8528501199489045e-06,
+ "loss": 0.0088,
+ "step": 145
+ },
+ {
+ "epoch": 1.703125,
+ "grad_norm": 0.5876736640930176,
+ "learning_rate": 4.846306043401268e-06,
+ "loss": 0.0057,
+ "step": 146
+ },
+ {
+ "epoch": 1.71484375,
+ "grad_norm": 0.5149250626564026,
+ "learning_rate": 4.839624209571352e-06,
+ "loss": 0.0056,
+ "step": 147
+ },
+ {
+ "epoch": 1.7265625,
+ "grad_norm": 0.7009180784225464,
+ "learning_rate": 4.832805010765724e-06,
+ "loss": 0.0088,
+ "step": 148
+ },
+ {
+ "epoch": 1.73828125,
+ "grad_norm": 0.42258334159851074,
+ "learning_rate": 4.8258488473559794e-06,
+ "loss": 0.004,
+ "step": 149
+ },
+ {
+ "epoch": 1.75,
+ "grad_norm": 0.39231887459754944,
+ "learning_rate": 4.8187561277552376e-06,
+ "loss": 0.005,
+ "step": 150
+ },
+ {
+ "epoch": 1.76171875,
+ "grad_norm": 0.3317432701587677,
+ "learning_rate": 4.811527268394157e-06,
+ "loss": 0.0038,
+ "step": 151
+ },
+ {
+ "epoch": 1.7734375,
+ "grad_norm": 0.5022267699241638,
+ "learning_rate": 4.804162693696494e-06,
+ "loss": 0.0056,
+ "step": 152
+ },
+ {
+ "epoch": 1.78515625,
+ "grad_norm": 0.39019322395324707,
+ "learning_rate": 4.796662836054176e-06,
+ "loss": 0.0053,
+ "step": 153
+ },
+ {
+ "epoch": 1.796875,
+ "grad_norm": 0.5674042701721191,
+ "learning_rate": 4.789028135801919e-06,
+ "loss": 0.007,
+ "step": 154
+ },
+ {
+ "epoch": 1.80859375,
+ "grad_norm": 0.5690024495124817,
+ "learning_rate": 4.7812590411913755e-06,
+ "loss": 0.0053,
+ "step": 155
+ },
+ {
+ "epoch": 1.8203125,
+ "grad_norm": 0.23775412142276764,
+ "learning_rate": 4.773356008364812e-06,
+ "loss": 0.0031,
+ "step": 156
+ },
+ {
+ "epoch": 1.83203125,
+ "grad_norm": 0.4698558747768402,
+ "learning_rate": 4.765319501328332e-06,
+ "loss": 0.0021,
+ "step": 157
+ },
+ {
+ "epoch": 1.84375,
+ "grad_norm": 0.21603639423847198,
+ "learning_rate": 4.757149991924633e-06,
+ "loss": 0.0046,
+ "step": 158
+ },
+ {
+ "epoch": 1.85546875,
+ "grad_norm": 0.33830726146698,
+ "learning_rate": 4.748847959805297e-06,
+ "loss": 0.0022,
+ "step": 159
+ },
+ {
+ "epoch": 1.8671875,
+ "grad_norm": 0.44919782876968384,
+ "learning_rate": 4.740413892402639e-06,
+ "loss": 0.0032,
+ "step": 160
+ },
+ {
+ "epoch": 1.87890625,
+ "grad_norm": 0.5119614601135254,
+ "learning_rate": 4.731848284901082e-06,
+ "loss": 0.006,
+ "step": 161
+ },
+ {
+ "epoch": 1.890625,
+ "grad_norm": 0.3875437080860138,
+ "learning_rate": 4.723151640208084e-06,
+ "loss": 0.0024,
+ "step": 162
+ },
+ {
+ "epoch": 1.90234375,
+ "grad_norm": 0.3179910182952881,
+ "learning_rate": 4.714324468924614e-06,
+ "loss": 0.0037,
+ "step": 163
+ },
+ {
+ "epoch": 1.9140625,
+ "grad_norm": 0.43395644426345825,
+ "learning_rate": 4.705367289315172e-06,
+ "loss": 0.0027,
+ "step": 164
+ },
+ {
+ "epoch": 1.92578125,
+ "grad_norm": 0.3703945577144623,
+ "learning_rate": 4.696280627277356e-06,
+ "loss": 0.0047,
+ "step": 165
+ },
+ {
+ "epoch": 1.9375,
+ "grad_norm": 0.2503529191017151,
+ "learning_rate": 4.687065016310996e-06,
+ "loss": 0.0052,
+ "step": 166
+ },
+ {
+ "epoch": 1.94921875,
+ "grad_norm": 0.3613075315952301,
+ "learning_rate": 4.6777209974868194e-06,
+ "loss": 0.0034,
+ "step": 167
+ },
+ {
+ "epoch": 1.9609375,
+ "grad_norm": 0.3578515350818634,
+ "learning_rate": 4.668249119414692e-06,
+ "loss": 0.0021,
+ "step": 168
+ },
+ {
+ "epoch": 1.97265625,
+ "grad_norm": 0.1784515529870987,
+ "learning_rate": 4.6586499382113985e-06,
+ "loss": 0.0018,
+ "step": 169
+ },
+ {
+ "epoch": 1.984375,
+ "grad_norm": 0.259198397397995,
+ "learning_rate": 4.648924017468003e-06,
+ "loss": 0.0009,
+ "step": 170
+ },
+ {
+ "epoch": 1.99609375,
+ "grad_norm": 0.7194133400917053,
+ "learning_rate": 4.6390719282167515e-06,
+ "loss": 0.0041,
+ "step": 171
+ },
+ {
+ "epoch": 2.0,
+ "grad_norm": 0.7194133400917053,
+ "learning_rate": 4.629094248897546e-06,
+ "loss": 0.0014,
+ "step": 172
+ },
+ {
+ "epoch": 2.01171875,
+ "grad_norm": 0.5032601952552795,
+ "learning_rate": 4.618991565323987e-06,
+ "loss": 0.0028,
+ "step": 173
+ },
+ {
+ "epoch": 2.0234375,
+ "grad_norm": 0.6387512683868408,
+ "learning_rate": 4.608764470648971e-06,
+ "loss": 0.0007,
+ "step": 174
+ },
+ {
+ "epoch": 2.03515625,
+ "grad_norm": 0.23177844285964966,
+ "learning_rate": 4.598413565329876e-06,
+ "loss": 0.0006,
+ "step": 175
+ },
+ {
+ "epoch": 2.046875,
+ "grad_norm": 0.1713147759437561,
+ "learning_rate": 4.587939457093296e-06,
+ "loss": 0.0003,
+ "step": 176
+ },
+ {
+ "epoch": 2.05859375,
+ "grad_norm": 0.06128697097301483,
+ "learning_rate": 4.577342760899368e-06,
+ "loss": 0.0001,
+ "step": 177
+ },
+ {
+ "epoch": 2.0703125,
+ "grad_norm": 0.538530170917511,
+ "learning_rate": 4.566624098905665e-06,
+ "loss": 0.0004,
+ "step": 178
+ },
+ {
+ "epoch": 2.08203125,
+ "grad_norm": 0.03301696106791496,
+ "learning_rate": 4.555784100430662e-06,
+ "loss": 0.0004,
+ "step": 179
+ },
+ {
+ "epoch": 2.09375,
+ "grad_norm": 0.21366432309150696,
+ "learning_rate": 4.544823401916794e-06,
+ "loss": 0.0014,
+ "step": 180
+ },
+ {
+ "epoch": 2.10546875,
+ "grad_norm": 0.13440090417861938,
+ "learning_rate": 4.533742646893086e-06,
+ "loss": 0.0004,
+ "step": 181
+ },
+ {
+ "epoch": 2.1171875,
+ "grad_norm": 0.531997799873352,
+ "learning_rate": 4.522542485937369e-06,
+ "loss": 0.0008,
+ "step": 182
+ },
+ {
+ "epoch": 2.12890625,
+ "grad_norm": 0.2832719385623932,
+ "learning_rate": 4.511223576638084e-06,
+ "loss": 0.0023,
+ "step": 183
+ },
+ {
+ "epoch": 2.140625,
+ "grad_norm": 0.3814002275466919,
+ "learning_rate": 4.499786583555675e-06,
+ "loss": 0.001,
+ "step": 184
+ },
+ {
+ "epoch": 2.15234375,
+ "grad_norm": 0.2522885501384735,
+ "learning_rate": 4.4882321781835666e-06,
+ "loss": 0.0004,
+ "step": 185
+ },
+ {
+ "epoch": 2.1640625,
+ "grad_norm": 0.3866797983646393,
+ "learning_rate": 4.476561038908745e-06,
+ "loss": 0.0007,
+ "step": 186
+ },
+ {
+ "epoch": 2.17578125,
+ "grad_norm": 0.2128417044878006,
+ "learning_rate": 4.464773850971924e-06,
+ "loss": 0.0001,
+ "step": 187
+ },
+ {
+ "epoch": 2.1875,
+ "grad_norm": 0.135880708694458,
+ "learning_rate": 4.452871306427314e-06,
+ "loss": 0.0031,
+ "step": 188
+ },
+ {
+ "epoch": 2.19921875,
+ "grad_norm": 0.38835451006889343,
+ "learning_rate": 4.440854104101988e-06,
+ "loss": 0.0015,
+ "step": 189
+ },
+ {
+ "epoch": 2.2109375,
+ "grad_norm": 0.18233123421669006,
+ "learning_rate": 4.428722949554858e-06,
+ "loss": 0.0001,
+ "step": 190
+ },
+ {
+ "epoch": 2.22265625,
+ "grad_norm": 0.10753051191568375,
+ "learning_rate": 4.416478555035241e-06,
+ "loss": 0.0017,
+ "step": 191
+ },
+ {
+ "epoch": 2.234375,
+ "grad_norm": 0.30138343572616577,
+ "learning_rate": 4.404121639441047e-06,
+ "loss": 0.0004,
+ "step": 192
+ },
+ {
+ "epoch": 2.24609375,
+ "grad_norm": 0.12771356105804443,
+ "learning_rate": 4.391652928276572e-06,
+ "loss": 0.0022,
+ "step": 193
+ },
+ {
+ "epoch": 2.2578125,
+ "grad_norm": 0.4173564612865448,
+ "learning_rate": 4.379073153609896e-06,
+ "loss": 0.0001,
+ "step": 194
+ },
+ {
+ "epoch": 2.26953125,
+ "grad_norm": 0.08329658955335617,
+ "learning_rate": 4.366383054029907e-06,
+ "loss": 0.0009,
+ "step": 195
+ },
+ {
+ "epoch": 2.28125,
+ "grad_norm": 0.21187439560890198,
+ "learning_rate": 4.3535833746029335e-06,
+ "loss": 0.0013,
+ "step": 196
+ },
+ {
+ "epoch": 2.29296875,
+ "grad_norm": 0.046030864119529724,
+ "learning_rate": 4.340674866829001e-06,
+ "loss": 0.0004,
+ "step": 197
+ },
+ {
+ "epoch": 2.3046875,
+ "grad_norm": 0.08373020589351654,
+ "learning_rate": 4.32765828859771e-06,
+ "loss": 0.0014,
+ "step": 198
+ },
+ {
+ "epoch": 2.31640625,
+ "grad_norm": 0.4026390314102173,
+ "learning_rate": 4.314534404143738e-06,
+ "loss": 0.0003,
+ "step": 199
+ },
+ {
+ "epoch": 2.328125,
+ "grad_norm": 0.24255593121051788,
+ "learning_rate": 4.3013039840019675e-06,
+ "loss": 0.0009,
+ "step": 200
+ },
+ {
+ "epoch": 2.33984375,
+ "grad_norm": 0.2282780110836029,
+ "learning_rate": 4.287967804962252e-06,
+ "loss": 0.0025,
+ "step": 201
+ },
+ {
+ "epoch": 2.3515625,
+ "grad_norm": 0.14743350446224213,
+ "learning_rate": 4.274526650023801e-06,
+ "loss": 0.0014,
+ "step": 202
+ },
+ {
+ "epoch": 2.36328125,
+ "grad_norm": 0.17971713840961456,
+ "learning_rate": 4.260981308349214e-06,
+ "loss": 0.0003,
+ "step": 203
+ },
+ {
+ "epoch": 2.375,
+ "grad_norm": 0.03872796148061752,
+ "learning_rate": 4.247332575218144e-06,
+ "loss": 0.0003,
+ "step": 204
+ },
+ {
+ "epoch": 2.38671875,
+ "grad_norm": 0.06636863946914673,
+ "learning_rate": 4.233581251980604e-06,
+ "loss": 0.0004,
+ "step": 205
+ },
+ {
+ "epoch": 2.3984375,
+ "grad_norm": 0.1254304051399231,
+ "learning_rate": 4.2197281460099245e-06,
+ "loss": 0.0002,
+ "step": 206
+ },
+ {
+ "epoch": 2.41015625,
+ "grad_norm": 0.03998701646924019,
+ "learning_rate": 4.2057740706553415e-06,
+ "loss": 0.0007,
+ "step": 207
+ },
+ {
+ "epoch": 2.421875,
+ "grad_norm": 0.8734745979309082,
+ "learning_rate": 4.191719845194246e-06,
+ "loss": 0.0019,
+ "step": 208
+ },
+ {
+ "epoch": 2.43359375,
+ "grad_norm": 0.34975236654281616,
+ "learning_rate": 4.177566294784085e-06,
+ "loss": 0.0006,
+ "step": 209
+ },
+ {
+ "epoch": 2.4453125,
+ "grad_norm": 0.07566183060407639,
+ "learning_rate": 4.163314250413913e-06,
+ "loss": 0.0003,
+ "step": 210
+ },
+ {
+ "epoch": 2.45703125,
+ "grad_norm": 0.09056711941957474,
+ "learning_rate": 4.148964548855603e-06,
+ "loss": 0.0002,
+ "step": 211
+ },
+ {
+ "epoch": 2.46875,
+ "grad_norm": 0.16160684823989868,
+ "learning_rate": 4.134518032614713e-06,
+ "loss": 0.0009,
+ "step": 212
+ },
+ {
+ "epoch": 2.48046875,
+ "grad_norm": 0.0812753438949585,
+ "learning_rate": 4.119975549881029e-06,
+ "loss": 0.0002,
+ "step": 213
+ },
+ {
+ "epoch": 2.4921875,
+ "grad_norm": 0.05827738344669342,
+ "learning_rate": 4.105337954478756e-06,
+ "loss": 0.0007,
+ "step": 214
+ },
+ {
+ "epoch": 2.50390625,
+ "grad_norm": 0.2625848054885864,
+ "learning_rate": 4.0906061058164e-06,
+ "loss": 0.0003,
+ "step": 215
+ },
+ {
+ "epoch": 2.515625,
+ "grad_norm": 0.1771923154592514,
+ "learning_rate": 4.075780868836296e-06,
+ "loss": 0.0005,
+ "step": 216
+ },
+ {
+ "epoch": 2.52734375,
+ "grad_norm": 0.034166041761636734,
+ "learning_rate": 4.060863113963835e-06,
+ "loss": 0.0012,
+ "step": 217
+ },
+ {
+ "epoch": 2.5390625,
+ "grad_norm": 0.14099521934986115,
+ "learning_rate": 4.045853717056358e-06,
+ "loss": 0.0,
+ "step": 218
+ },
+ {
+ "epoch": 2.55078125,
+ "grad_norm": 0.34704917669296265,
+ "learning_rate": 4.030753559351728e-06,
+ "loss": 0.0006,
+ "step": 219
+ },
+ {
+ "epoch": 2.5625,
+ "grad_norm": 0.25681111216545105,
+ "learning_rate": 4.015563527416596e-06,
+ "loss": 0.0004,
+ "step": 220
+ },
+ {
+ "epoch": 2.57421875,
+ "grad_norm": 0.36212408542633057,
+ "learning_rate": 4.000284513094342e-06,
+ "loss": 0.0003,
+ "step": 221
+ },
+ {
+ "epoch": 2.5859375,
+ "grad_norm": 0.13945375382900238,
+ "learning_rate": 3.984917413452721e-06,
+ "loss": 0.0001,
+ "step": 222
+ },
+ {
+ "epoch": 2.59765625,
+ "grad_norm": 0.06798060238361359,
+ "learning_rate": 3.969463130731183e-06,
+ "loss": 0.0007,
+ "step": 223
+ },
+ {
+ "epoch": 2.609375,
+ "grad_norm": 0.19848179817199707,
+ "learning_rate": 3.953922572287915e-06,
+ "loss": 0.0007,
+ "step": 224
+ },
+ {
+ "epoch": 2.62109375,
+ "grad_norm": 0.5454645156860352,
+ "learning_rate": 3.938296650546552e-06,
+ "loss": 0.0018,
+ "step": 225
+ },
+ {
+ "epoch": 2.6328125,
+ "grad_norm": 0.22043731808662415,
+ "learning_rate": 3.9225862829426184e-06,
+ "loss": 0.0036,
+ "step": 226
+ },
+ {
+ "epoch": 2.64453125,
+ "grad_norm": 0.3086087107658386,
+ "learning_rate": 3.906792391869657e-06,
+ "loss": 0.0002,
+ "step": 227
+ },
+ {
+ "epoch": 2.65625,
+ "grad_norm": 0.04387599974870682,
+ "learning_rate": 3.890915904625075e-06,
+ "loss": 0.0014,
+ "step": 228
+ },
+ {
+ "epoch": 2.66796875,
+ "grad_norm": 0.3786030113697052,
+ "learning_rate": 3.874957753355701e-06,
+ "loss": 0.0014,
+ "step": 229
+ },
+ {
+ "epoch": 2.6796875,
+ "grad_norm": 0.28310713171958923,
+ "learning_rate": 3.858918875003053e-06,
+ "loss": 0.0001,
+ "step": 230
+ },
+ {
+ "epoch": 2.69140625,
+ "grad_norm": 0.0586460717022419,
+ "learning_rate": 3.842800211248333e-06,
+ "loss": 0.0001,
+ "step": 231
+ },
+ {
+ "epoch": 2.703125,
+ "grad_norm": 0.11408677697181702,
+ "learning_rate": 3.8266027084571335e-06,
+ "loss": 0.001,
+ "step": 232
+ },
+ {
+ "epoch": 2.71484375,
+ "grad_norm": 0.06875021010637283,
+ "learning_rate": 3.810327317623881e-06,
+ "loss": 0.0001,
+ "step": 233
+ },
+ {
+ "epoch": 2.7265625,
+ "grad_norm": 0.037388525903224945,
+ "learning_rate": 3.793974994315991e-06,
+ "loss": 0.0002,
+ "step": 234
+ },
+ {
+ "epoch": 2.73828125,
+ "grad_norm": 0.041430581361055374,
+ "learning_rate": 3.7775466986177763e-06,
+ "loss": 0.0015,
+ "step": 235
+ },
+ {
+ "epoch": 2.75,
+ "grad_norm": 0.26019373536109924,
+ "learning_rate": 3.7610433950740667e-06,
+ "loss": 0.0022,
+ "step": 236
+ },
+ {
+ "epoch": 2.76171875,
+ "grad_norm": 0.16638831794261932,
+ "learning_rate": 3.7444660526335853e-06,
+ "loss": 0.0001,
+ "step": 237
+ },
+ {
+ "epoch": 2.7734375,
+ "grad_norm": 0.11822371184825897,
+ "learning_rate": 3.7278156445920584e-06,
+ "loss": 0.0004,
+ "step": 238
+ },
+ {
+ "epoch": 2.78515625,
+ "grad_norm": 0.055076126009225845,
+ "learning_rate": 3.711093148535068e-06,
+ "loss": 0.0001,
+ "step": 239
+ },
+ {
+ "epoch": 2.796875,
+ "grad_norm": 0.08209875971078873,
+ "learning_rate": 3.6942995462806574e-06,
+ "loss": 0.0012,
+ "step": 240
+ },
+ {
+ "epoch": 2.80859375,
+ "grad_norm": 0.10523220896720886,
+ "learning_rate": 3.6774358238216878e-06,
+ "loss": 0.0004,
+ "step": 241
+ },
+ {
+ "epoch": 2.8203125,
+ "grad_norm": 0.09211058169603348,
+ "learning_rate": 3.660502971267945e-06,
+ "loss": 0.0007,
+ "step": 242
+ },
+ {
+ "epoch": 2.83203125,
+ "grad_norm": 0.6209844946861267,
+ "learning_rate": 3.6435019827880093e-06,
+ "loss": 0.0004,
+ "step": 243
+ },
+ {
+ "epoch": 2.84375,
+ "grad_norm": 0.030900023877620697,
+ "learning_rate": 3.626433856550886e-06,
+ "loss": 0.0002,
+ "step": 244
+ },
+ {
+ "epoch": 2.85546875,
+ "grad_norm": 0.041130077093839645,
+ "learning_rate": 3.6092995946673996e-06,
+ "loss": 0.0003,
+ "step": 245
+ },
+ {
+ "epoch": 2.8671875,
+ "grad_norm": 0.052536819130182266,
+ "learning_rate": 3.5921002031313586e-06,
+ "loss": 0.0001,
+ "step": 246
+ },
+ {
+ "epoch": 2.87890625,
+ "grad_norm": 0.027478178963065147,
+ "learning_rate": 3.574836691760489e-06,
+ "loss": 0.0011,
+ "step": 247
+ },
+ {
+ "epoch": 2.890625,
+ "grad_norm": 0.11695867031812668,
+ "learning_rate": 3.557510074137147e-06,
+ "loss": 0.0002,
+ "step": 248
+ },
+ {
+ "epoch": 2.90234375,
+ "grad_norm": 0.08782754838466644,
+ "learning_rate": 3.540121367548811e-06,
+ "loss": 0.001,
+ "step": 249
+ },
+ {
+ "epoch": 2.9140625,
+ "grad_norm": 0.19123269617557526,
+ "learning_rate": 3.5226715929283507e-06,
+ "loss": 0.0001,
+ "step": 250
+ },
+ {
+ "epoch": 2.92578125,
+ "grad_norm": 0.020774945616722107,
+ "learning_rate": 3.505161774794085e-06,
+ "loss": 0.0006,
+ "step": 251
+ },
+ {
+ "epoch": 2.9375,
+ "grad_norm": 0.12062892317771912,
+ "learning_rate": 3.487592941189636e-06,
+ "loss": 0.0001,
+ "step": 252
+ },
+ {
+ "epoch": 2.94921875,
+ "grad_norm": 0.013076180592179298,
+ "learning_rate": 3.469966123623563e-06,
+ "loss": 0.0011,
+ "step": 253
+ },
+ {
+ "epoch": 2.9609375,
+ "grad_norm": 0.22065430879592896,
+ "learning_rate": 3.4522823570088073e-06,
+ "loss": 0.0001,
+ "step": 254
+ },
+ {
+ "epoch": 2.97265625,
+ "grad_norm": 0.027459079399704933,
+ "learning_rate": 3.434542679601922e-06,
+ "loss": 0.0003,
+ "step": 255
+ },
+ {
+ "epoch": 2.984375,
+ "grad_norm": 0.07469172775745392,
+ "learning_rate": 3.4167481329421204e-06,
+ "loss": 0.0005,
+ "step": 256
+ },
+ {
+ "epoch": 2.99609375,
+ "grad_norm": 0.544292688369751,
+ "learning_rate": 3.39889976179012e-06,
+ "loss": 0.0001,
+ "step": 257
+ },
+ {
+ "epoch": 3.0,
+ "grad_norm": 0.02610701508820057,
+ "learning_rate": 3.380998614066805e-06,
+ "loss": 0.0,
+ "step": 258
+ },
+ {
+ "epoch": 3.01171875,
+ "grad_norm": 0.016433028504252434,
+ "learning_rate": 3.363045740791698e-06,
+ "loss": 0.0,
+ "step": 259
+ },
+ {
+ "epoch": 3.0234375,
+ "grad_norm": 0.009407744742929935,
+ "learning_rate": 3.345042196021257e-06,
+ "loss": 0.0,
+ "step": 260
+ },
+ {
+ "epoch": 3.03515625,
+ "grad_norm": 0.009587760083377361,
+ "learning_rate": 3.326989036786981e-06,
+ "loss": 0.0,
+ "step": 261
+ },
+ {
+ "epoch": 3.046875,
+ "grad_norm": 0.021458568051457405,
+ "learning_rate": 3.3088873230333562e-06,
+ "loss": 0.0001,
+ "step": 262
+ },
+ {
+ "epoch": 3.05859375,
+ "grad_norm": 1.3090940713882446,
+ "learning_rate": 3.290738117555622e-06,
+ "loss": 0.0007,
+ "step": 263
+ },
+ {
+ "epoch": 3.0703125,
+ "grad_norm": 0.008000005036592484,
+ "learning_rate": 3.272542485937369e-06,
+ "loss": 0.0,
+ "step": 264
+ },
+ {
+ "epoch": 3.08203125,
+ "grad_norm": 0.11048968136310577,
+ "learning_rate": 3.2543014964879814e-06,
+ "loss": 0.0004,
+ "step": 265
+ },
+ {
+ "epoch": 3.09375,
+ "grad_norm": 0.010688518173992634,
+ "learning_rate": 3.2360162201799085e-06,
+ "loss": 0.0,
+ "step": 266
+ },
+ {
+ "epoch": 3.10546875,
+ "grad_norm": 0.0585443377494812,
+ "learning_rate": 3.21768773058579e-06,
+ "loss": 0.0001,
+ "step": 267
+ },
+ {
+ "epoch": 3.1171875,
+ "grad_norm": 0.12098421901464462,
+ "learning_rate": 3.1993171038154203e-06,
+ "loss": 0.0002,
+ "step": 268
+ },
+ {
+ "epoch": 3.12890625,
+ "grad_norm": 0.01194986142218113,
+ "learning_rate": 3.180905418452569e-06,
+ "loss": 0.0,
+ "step": 269
+ },
+ {
+ "epoch": 3.140625,
+ "grad_norm": 0.0898946076631546,
+ "learning_rate": 3.162453755491655e-06,
+ "loss": 0.0011,
+ "step": 270
+ },
+ {
+ "epoch": 3.15234375,
+ "grad_norm": 0.04248907417058945,
+ "learning_rate": 3.143963198274278e-06,
+ "loss": 0.0001,
+ "step": 271
+ },
+ {
+ "epoch": 3.1640625,
+ "grad_norm": 0.11775418370962143,
+ "learning_rate": 3.125434832425613e-06,
+ "loss": 0.0002,
+ "step": 272
+ },
+ {
+ "epoch": 3.17578125,
+ "grad_norm": 0.009955376386642456,
+ "learning_rate": 3.1068697457906736e-06,
+ "loss": 0.0,
+ "step": 273
+ },
+ {
+ "epoch": 3.1875,
+ "grad_norm": 0.010195266455411911,
+ "learning_rate": 3.0882690283704355e-06,
+ "loss": 0.0,
+ "step": 274
+ },
+ {
+ "epoch": 3.19921875,
+ "grad_norm": 0.0036824019625782967,
+ "learning_rate": 3.0696337722578444e-06,
+ "loss": 0.0,
+ "step": 275
+ },
+ {
+ "epoch": 3.2109375,
+ "grad_norm": 0.004132798407226801,
+ "learning_rate": 3.0509650715736977e-06,
+ "loss": 0.0,
+ "step": 276
+ },
+ {
+ "epoch": 3.22265625,
+ "grad_norm": 0.0651523619890213,
+ "learning_rate": 3.0322640224024024e-06,
+ "loss": 0.0001,
+ "step": 277
+ },
+ {
+ "epoch": 3.234375,
+ "grad_norm": 0.015174048021435738,
+ "learning_rate": 3.0135317227276247e-06,
+ "loss": 0.0,
+ "step": 278
+ },
+ {
+ "epoch": 3.24609375,
+ "grad_norm": 0.004420771263539791,
+ "learning_rate": 2.994769272367822e-06,
+ "loss": 0.0,
+ "step": 279
+ },
+ {
+ "epoch": 3.2578125,
+ "grad_norm": 0.019537663087248802,
+ "learning_rate": 2.975977772911671e-06,
+ "loss": 0.0001,
+ "step": 280
+ },
+ {
+ "epoch": 3.26953125,
+ "grad_norm": 0.005312444642186165,
+ "learning_rate": 2.9571583276533923e-06,
+ "loss": 0.0,
+ "step": 281
+ },
+ {
+ "epoch": 3.28125,
+ "grad_norm": 0.005001228302717209,
+ "learning_rate": 2.93831204152797e-06,
+ "loss": 0.0,
+ "step": 282
+ },
+ {
+ "epoch": 3.29296875,
+ "grad_norm": 0.02515912428498268,
+ "learning_rate": 2.9194400210462808e-06,
+ "loss": 0.0,
+ "step": 283
+ },
+ {
+ "epoch": 3.3046875,
+ "grad_norm": 0.0026461018715053797,
+ "learning_rate": 2.9005433742301274e-06,
+ "loss": 0.0,
+ "step": 284
+ },
+ {
+ "epoch": 3.31640625,
+ "grad_norm": 0.008561859838664532,
+ "learning_rate": 2.8816232105471864e-06,
+ "loss": 0.0,
+ "step": 285
+ },
+ {
+ "epoch": 3.328125,
+ "grad_norm": 0.0016494860174134374,
+ "learning_rate": 2.8626806408458626e-06,
+ "loss": 0.0,
+ "step": 286
+ },
+ {
+ "epoch": 3.33984375,
+ "grad_norm": 0.13021136820316315,
+ "learning_rate": 2.843716777290074e-06,
+ "loss": 0.0007,
+ "step": 287
+ },
+ {
+ "epoch": 3.3515625,
+ "grad_norm": 0.0030203904025256634,
+ "learning_rate": 2.8247327332939512e-06,
+ "loss": 0.0,
+ "step": 288
+ },
+ {
+ "epoch": 3.36328125,
+ "grad_norm": 0.03953886777162552,
+ "learning_rate": 2.805729623456469e-06,
+ "loss": 0.0,
+ "step": 289
+ },
+ {
+ "epoch": 3.375,
+ "grad_norm": 0.016400372609496117,
+ "learning_rate": 2.786708563496002e-06,
+ "loss": 0.0,
+ "step": 290
+ },
+ {
+ "epoch": 3.38671875,
+ "grad_norm": 0.0036580052692443132,
+ "learning_rate": 2.7676706701848187e-06,
+ "loss": 0.0,
+ "step": 291
+ },
+ {
+ "epoch": 3.3984375,
+ "grad_norm": 0.013516291044652462,
+ "learning_rate": 2.748617061283518e-06,
+ "loss": 0.0,
+ "step": 292
+ },
+ {
+ "epoch": 3.41015625,
+ "grad_norm": 0.0161955077201128,
+ "learning_rate": 2.7295488554753957e-06,
+ "loss": 0.0,
+ "step": 293
+ },
+ {
+ "epoch": 3.421875,
+ "grad_norm": 0.030412085354328156,
+ "learning_rate": 2.710467172300768e-06,
+ "loss": 0.0,
+ "step": 294
+ },
+ {
+ "epoch": 3.43359375,
+ "grad_norm": 0.009741670452058315,
+ "learning_rate": 2.69137313209124e-06,
+ "loss": 0.0,
+ "step": 295
+ },
+ {
+ "epoch": 3.4453125,
+ "grad_norm": 0.0022640388924628496,
+ "learning_rate": 2.672267855903927e-06,
+ "loss": 0.0,
+ "step": 296
+ },
+ {
+ "epoch": 3.45703125,
+ "grad_norm": 0.004546131007373333,
+ "learning_rate": 2.653152465455639e-06,
+ "loss": 0.0,
+ "step": 297
+ },
+ {
+ "epoch": 3.46875,
+ "grad_norm": 0.00977818388491869,
+ "learning_rate": 2.6340280830570142e-06,
+ "loss": 0.0,
+ "step": 298
+ },
+ {
+ "epoch": 3.48046875,
+ "grad_norm": 0.00292399013414979,
+ "learning_rate": 2.614895831546633e-06,
+ "loss": 0.0,
+ "step": 299
+ },
+ {
+ "epoch": 3.4921875,
+ "grad_norm": 0.02362428605556488,
+ "learning_rate": 2.595756834225089e-06,
+ "loss": 0.0001,
+ "step": 300
+ },
+ {
+ "epoch": 3.50390625,
+ "grad_norm": 0.05170333385467529,
+ "learning_rate": 2.576612214789039e-06,
+ "loss": 0.0001,
+ "step": 301
+ },
+ {
+ "epoch": 3.515625,
+ "grad_norm": 0.002428271807730198,
+ "learning_rate": 2.5574630972652263e-06,
+ "loss": 0.0,
+ "step": 302
+ },
+ {
+ "epoch": 3.52734375,
+ "grad_norm": 0.0020236221607774496,
+ "learning_rate": 2.538310605944491e-06,
+ "loss": 0.0,
+ "step": 303
+ },
+ {
+ "epoch": 3.5390625,
+ "grad_norm": 0.0026413940358906984,
+ "learning_rate": 2.5191558653157542e-06,
+ "loss": 0.0,
+ "step": 304
+ },
+ {
+ "epoch": 3.55078125,
+ "grad_norm": 0.001937767956405878,
+ "learning_rate": 2.5e-06,
+ "loss": 0.0,
+ "step": 305
+ },
+ {
+ "epoch": 3.5625,
+ "grad_norm": 0.013072842732071877,
+ "learning_rate": 2.480844134684246e-06,
+ "loss": 0.0,
+ "step": 306
+ },
+ {
+ "epoch": 3.57421875,
+ "grad_norm": 0.07046481966972351,
+ "learning_rate": 2.4616893940555094e-06,
+ "loss": 0.0003,
+ "step": 307
+ },
+ {
+ "epoch": 3.5859375,
+ "grad_norm": 0.002507950412109494,
+ "learning_rate": 2.4425369027347746e-06,
+ "loss": 0.0,
+ "step": 308
+ },
+ {
+ "epoch": 3.59765625,
+ "grad_norm": 0.0024932159576565027,
+ "learning_rate": 2.423387785210962e-06,
+ "loss": 0.0,
+ "step": 309
+ },
+ {
+ "epoch": 3.609375,
+ "grad_norm": 0.007839293219149113,
+ "learning_rate": 2.404243165774912e-06,
+ "loss": 0.0,
+ "step": 310
+ },
+ {
+ "epoch": 3.62109375,
+ "grad_norm": 0.008749544620513916,
+ "learning_rate": 2.3851041684533677e-06,
+ "loss": 0.0,
+ "step": 311
+ },
+ {
+ "epoch": 3.6328125,
+ "grad_norm": 0.00224123802036047,
+ "learning_rate": 2.3659719169429866e-06,
+ "loss": 0.0,
+ "step": 312
+ },
+ {
+ "epoch": 3.64453125,
+ "grad_norm": 0.0036495248787105083,
+ "learning_rate": 2.346847534544362e-06,
+ "loss": 0.0,
+ "step": 313
+ },
+ {
+ "epoch": 3.65625,
+ "grad_norm": 0.008617470040917397,
+ "learning_rate": 2.3277321440960733e-06,
+ "loss": 0.0,
+ "step": 314
+ },
+ {
+ "epoch": 3.66796875,
+ "grad_norm": 0.20711803436279297,
+ "learning_rate": 2.308626867908761e-06,
+ "loss": 0.0004,
+ "step": 315
+ },
+ {
+ "epoch": 3.6796875,
+ "grad_norm": 0.002029536757618189,
+ "learning_rate": 2.2895328276992325e-06,
+ "loss": 0.0,
+ "step": 316
+ },
+ {
+ "epoch": 3.69140625,
+ "grad_norm": 0.0029692472890019417,
+ "learning_rate": 2.270451144524605e-06,
+ "loss": 0.0,
+ "step": 317
+ },
+ {
+ "epoch": 3.703125,
+ "grad_norm": 0.003482841420918703,
+ "learning_rate": 2.251382938716482e-06,
+ "loss": 0.0,
+ "step": 318
+ },
+ {
+ "epoch": 3.71484375,
+ "grad_norm": 0.004736272618174553,
+ "learning_rate": 2.2323293298151817e-06,
+ "loss": 0.0,
+ "step": 319
+ },
+ {
+ "epoch": 3.7265625,
+ "grad_norm": 0.002524860203266144,
+ "learning_rate": 2.2132914365039993e-06,
+ "loss": 0.0,
+ "step": 320
+ },
+ {
+ "epoch": 3.73828125,
+ "grad_norm": 0.0024032641667872667,
+ "learning_rate": 2.1942703765435317e-06,
+ "loss": 0.0,
+ "step": 321
+ },
+ {
+ "epoch": 3.75,
+ "grad_norm": 0.06402894109487534,
+ "learning_rate": 2.1752672667060488e-06,
+ "loss": 0.0002,
+ "step": 322
+ },
+ {
+ "epoch": 3.76171875,
+ "grad_norm": 0.0013841127511113882,
+ "learning_rate": 2.1562832227099266e-06,
+ "loss": 0.0,
+ "step": 323
+ },
+ {
+ "epoch": 3.7734375,
+ "grad_norm": 0.002198501257225871,
+ "learning_rate": 2.137319359154138e-06,
+ "loss": 0.0,
+ "step": 324
+ },
+ {
+ "epoch": 3.78515625,
+ "grad_norm": 0.004288461524993181,
+ "learning_rate": 2.1183767894528135e-06,
+ "loss": 0.0,
+ "step": 325
+ },
+ {
+ "epoch": 3.796875,
+ "grad_norm": 0.16602352261543274,
+ "learning_rate": 2.099456625769872e-06,
+ "loss": 0.0003,
+ "step": 326
+ },
+ {
+ "epoch": 3.80859375,
+ "grad_norm": 0.001620235969312489,
+ "learning_rate": 2.08055997895372e-06,
+ "loss": 0.0,
+ "step": 327
+ },
+ {
+ "epoch": 3.8203125,
+ "grad_norm": 0.004387021530419588,
+ "learning_rate": 2.0616879584720305e-06,
+ "loss": 0.0,
+ "step": 328
+ },
+ {
+ "epoch": 3.83203125,
+ "grad_norm": 0.040472231805324554,
+ "learning_rate": 2.042841672346608e-06,
+ "loss": 0.0001,
+ "step": 329
+ },
+ {
+ "epoch": 3.84375,
+ "grad_norm": 0.03627858683466911,
+ "learning_rate": 2.024022227088329e-06,
+ "loss": 0.0001,
+ "step": 330
+ },
+ {
+ "epoch": 3.85546875,
+ "grad_norm": 0.0029672810342162848,
+ "learning_rate": 2.0052307276321793e-06,
+ "loss": 0.0,
+ "step": 331
+ },
+ {
+ "epoch": 3.8671875,
+ "grad_norm": 0.0023526407312601805,
+ "learning_rate": 1.9864682772723757e-06,
+ "loss": 0.0,
+ "step": 332
+ },
+ {
+ "epoch": 3.87890625,
+ "grad_norm": 0.001383278169669211,
+ "learning_rate": 1.967735977597598e-06,
+ "loss": 0.0,
+ "step": 333
+ },
+ {
+ "epoch": 3.890625,
+ "grad_norm": 0.002337483922019601,
+ "learning_rate": 1.9490349284263036e-06,
+ "loss": 0.0,
+ "step": 334
+ },
+ {
+ "epoch": 3.90234375,
+ "grad_norm": 0.02629532851278782,
+ "learning_rate": 1.930366227742157e-06,
+ "loss": 0.0,
+ "step": 335
+ },
+ {
+ "epoch": 3.9140625,
+ "grad_norm": 0.03508671000599861,
+ "learning_rate": 1.9117309716295658e-06,
+ "loss": 0.0001,
+ "step": 336
+ },
+ {
+ "epoch": 3.92578125,
+ "grad_norm": 0.0021862757857888937,
+ "learning_rate": 1.8931302542093274e-06,
+ "loss": 0.0,
+ "step": 337
+ },
+ {
+ "epoch": 3.9375,
+ "grad_norm": 0.002468815306201577,
+ "learning_rate": 1.8745651675743876e-06,
+ "loss": 0.0,
+ "step": 338
+ },
+ {
+ "epoch": 3.94921875,
+ "grad_norm": 0.028530335053801537,
+ "learning_rate": 1.8560368017257229e-06,
+ "loss": 0.0001,
+ "step": 339
+ },
+ {
+ "epoch": 3.9609375,
+ "grad_norm": 0.004602192435413599,
+ "learning_rate": 1.8375462445083464e-06,
+ "loss": 0.0,
+ "step": 340
+ },
+ {
+ "epoch": 3.97265625,
+ "grad_norm": 0.004955258686095476,
+ "learning_rate": 1.8190945815474323e-06,
+ "loss": 0.0,
+ "step": 341
+ },
+ {
+ "epoch": 3.984375,
+ "grad_norm": 0.0018305755220353603,
+ "learning_rate": 1.8006828961845807e-06,
+ "loss": 0.0,
+ "step": 342
+ },
+ {
+ "epoch": 3.99609375,
+ "grad_norm": 0.004913098178803921,
+ "learning_rate": 1.782312269414211e-06,
+ "loss": 0.0,
+ "step": 343
+ },
+ {
+ "epoch": 4.0,
+ "grad_norm": 0.004913098178803921,
+ "learning_rate": 1.7639837798200923e-06,
+ "loss": 0.0,
+ "step": 344
+ },
+ {
+ "epoch": 4.01171875,
+ "grad_norm": 0.004227044992148876,
+ "learning_rate": 1.7456985035120194e-06,
+ "loss": 0.0,
+ "step": 345
+ },
+ {
+ "epoch": 4.0234375,
+ "grad_norm": 0.0020636608824133873,
+ "learning_rate": 1.7274575140626318e-06,
+ "loss": 0.0,
+ "step": 346
+ },
+ {
+ "epoch": 4.03515625,
+ "grad_norm": 0.010954855009913445,
+ "learning_rate": 1.709261882444379e-06,
+ "loss": 0.0,
+ "step": 347
+ },
+ {
+ "epoch": 4.046875,
+ "grad_norm": 0.021605566143989563,
+ "learning_rate": 1.6911126769666442e-06,
+ "loss": 0.0,
+ "step": 348
+ },
+ {
+ "epoch": 4.05859375,
+ "grad_norm": 0.003982124850153923,
+ "learning_rate": 1.6730109632130199e-06,
+ "loss": 0.0,
+ "step": 349
+ },
+ {
+ "epoch": 4.0703125,
+ "grad_norm": 0.019241735339164734,
+ "learning_rate": 1.6549578039787436e-06,
+ "loss": 0.0001,
+ "step": 350
+ },
+ {
+ "epoch": 4.08203125,
+ "grad_norm": 0.001743687316775322,
+ "learning_rate": 1.636954259208302e-06,
+ "loss": 0.0,
+ "step": 351
+ },
+ {
+ "epoch": 4.09375,
+ "grad_norm": 0.0027647230308502913,
+ "learning_rate": 1.6190013859331958e-06,
+ "loss": 0.0,
+ "step": 352
+ },
+ {
+ "epoch": 4.10546875,
+ "grad_norm": 0.001913967658765614,
+ "learning_rate": 1.6011002382098806e-06,
+ "loss": 0.0,
+ "step": 353
+ },
+ {
+ "epoch": 4.1171875,
+ "grad_norm": 0.0065271588973701,
+ "learning_rate": 1.5832518670578802e-06,
+ "loss": 0.0,
+ "step": 354
+ },
+ {
+ "epoch": 4.12890625,
+ "grad_norm": 0.0030666873790323734,
+ "learning_rate": 1.5654573203980782e-06,
+ "loss": 0.0,
+ "step": 355
+ },
+ {
+ "epoch": 4.140625,
+ "grad_norm": 0.006997556425631046,
+ "learning_rate": 1.5477176429911934e-06,
+ "loss": 0.0,
+ "step": 356
+ },
+ {
+ "epoch": 4.15234375,
+ "grad_norm": 0.0015223983209580183,
+ "learning_rate": 1.5300338763764371e-06,
+ "loss": 0.0,
+ "step": 357
+ },
+ {
+ "epoch": 4.1640625,
+ "grad_norm": 0.0016171627212315798,
+ "learning_rate": 1.5124070588103648e-06,
+ "loss": 0.0,
+ "step": 358
+ },
+ {
+ "epoch": 4.17578125,
+ "grad_norm": 0.001240705605596304,
+ "learning_rate": 1.4948382252059158e-06,
+ "loss": 0.0,
+ "step": 359
+ },
+ {
+ "epoch": 4.1875,
+ "grad_norm": 0.001194652053527534,
+ "learning_rate": 1.4773284070716504e-06,
+ "loss": 0.0,
+ "step": 360
+ },
+ {
+ "epoch": 4.19921875,
+ "grad_norm": 0.0016382395988330245,
+ "learning_rate": 1.4598786324511892e-06,
+ "loss": 0.0,
+ "step": 361
+ },
+ {
+ "epoch": 4.2109375,
+ "grad_norm": 0.004216539673507214,
+ "learning_rate": 1.4424899258628533e-06,
+ "loss": 0.0,
+ "step": 362
+ },
+ {
+ "epoch": 4.22265625,
+ "grad_norm": 0.0015016852412372828,
+ "learning_rate": 1.4251633082395117e-06,
+ "loss": 0.0,
+ "step": 363
+ },
+ {
+ "epoch": 4.234375,
+ "grad_norm": 0.002159053459763527,
+ "learning_rate": 1.4078997968686425e-06,
+ "loss": 0.0,
+ "step": 364
+ },
+ {
+ "epoch": 4.24609375,
+ "grad_norm": 0.0026948200538754463,
+ "learning_rate": 1.3907004053326006e-06,
+ "loss": 0.0,
+ "step": 365
+ },
+ {
+ "epoch": 4.2578125,
+ "grad_norm": 0.0025678593665361404,
+ "learning_rate": 1.373566143449115e-06,
+ "loss": 0.0,
+ "step": 366
+ },
+ {
+ "epoch": 4.26953125,
+ "grad_norm": 0.0020545010920614004,
+ "learning_rate": 1.3564980172119913e-06,
+ "loss": 0.0,
+ "step": 367
+ },
+ {
+ "epoch": 4.28125,
+ "grad_norm": 0.004045852459967136,
+ "learning_rate": 1.3394970287320553e-06,
+ "loss": 0.0,
+ "step": 368
+ },
+ {
+ "epoch": 4.29296875,
+ "grad_norm": 0.005362195894122124,
+ "learning_rate": 1.3225641761783126e-06,
+ "loss": 0.0,
+ "step": 369
+ },
+ {
+ "epoch": 4.3046875,
+ "grad_norm": 0.17514361441135406,
+ "learning_rate": 1.3057004537193424e-06,
+ "loss": 0.0002,
+ "step": 370
+ },
+ {
+ "epoch": 4.31640625,
+ "grad_norm": 0.002735719783231616,
+ "learning_rate": 1.2889068514649328e-06,
+ "loss": 0.0,
+ "step": 371
+ },
+ {
+ "epoch": 4.328125,
+ "grad_norm": 0.00350527698174119,
+ "learning_rate": 1.2721843554079418e-06,
+ "loss": 0.0,
+ "step": 372
+ },
+ {
+ "epoch": 4.33984375,
+ "grad_norm": 0.0011345328530296683,
+ "learning_rate": 1.2555339473664151e-06,
+ "loss": 0.0,
+ "step": 373
+ },
+ {
+ "epoch": 4.3515625,
+ "grad_norm": 0.01445677224546671,
+ "learning_rate": 1.238956604925934e-06,
+ "loss": 0.0,
+ "step": 374
+ },
+ {
+ "epoch": 4.36328125,
+ "grad_norm": 0.026896534487605095,
+ "learning_rate": 1.2224533013822237e-06,
+ "loss": 0.0,
+ "step": 375
+ },
+ {
+ "epoch": 4.375,
+ "grad_norm": 0.0032852741423994303,
+ "learning_rate": 1.206025005684009e-06,
+ "loss": 0.0,
+ "step": 376
+ },
+ {
+ "epoch": 4.38671875,
+ "grad_norm": 0.0014451753813773394,
+ "learning_rate": 1.1896726823761195e-06,
+ "loss": 0.0,
+ "step": 377
+ },
+ {
+ "epoch": 4.3984375,
+ "grad_norm": 0.002901519648730755,
+ "learning_rate": 1.1733972915428665e-06,
+ "loss": 0.0,
+ "step": 378
+ },
+ {
+ "epoch": 4.41015625,
+ "grad_norm": 0.001758516882546246,
+ "learning_rate": 1.1571997887516672e-06,
+ "loss": 0.0,
+ "step": 379
+ },
+ {
+ "epoch": 4.421875,
+ "grad_norm": 0.001257935306057334,
+ "learning_rate": 1.1410811249969475e-06,
+ "loss": 0.0,
+ "step": 380
+ },
+ {
+ "epoch": 4.43359375,
+ "grad_norm": 0.0016046202508732677,
+ "learning_rate": 1.1250422466442992e-06,
+ "loss": 0.0,
+ "step": 381
+ },
+ {
+ "epoch": 4.4453125,
+ "grad_norm": 0.0011374271707609296,
+ "learning_rate": 1.1090840953749253e-06,
+ "loss": 0.0,
+ "step": 382
+ },
+ {
+ "epoch": 4.45703125,
+ "grad_norm": 0.0027848149184137583,
+ "learning_rate": 1.0932076081303442e-06,
+ "loss": 0.0,
+ "step": 383
+ },
+ {
+ "epoch": 4.46875,
+ "grad_norm": 0.00223803473636508,
+ "learning_rate": 1.0774137170573826e-06,
+ "loss": 0.0,
+ "step": 384
+ },
+ {
+ "epoch": 4.48046875,
+ "grad_norm": 0.0018013437511399388,
+ "learning_rate": 1.0617033494534486e-06,
+ "loss": 0.0,
+ "step": 385
+ },
+ {
+ "epoch": 4.4921875,
+ "grad_norm": 0.0027079912833869457,
+ "learning_rate": 1.0460774277120866e-06,
+ "loss": 0.0,
+ "step": 386
+ },
+ {
+ "epoch": 4.50390625,
+ "grad_norm": 0.002311108633875847,
+ "learning_rate": 1.0305368692688175e-06,
+ "loss": 0.0,
+ "step": 387
+ },
+ {
+ "epoch": 4.515625,
+ "grad_norm": 0.001729196636006236,
+ "learning_rate": 1.0150825865472813e-06,
+ "loss": 0.0,
+ "step": 388
+ },
+ {
+ "epoch": 4.52734375,
+ "grad_norm": 0.002961450256407261,
+ "learning_rate": 9.997154869056588e-07,
+ "loss": 0.0,
+ "step": 389
+ },
+ {
+ "epoch": 4.5390625,
+ "grad_norm": 0.002972877351567149,
+ "learning_rate": 9.844364725834058e-07,
+ "loss": 0.0,
+ "step": 390
+ },
+ {
+ "epoch": 4.55078125,
+ "grad_norm": 0.0008791300351731479,
+ "learning_rate": 9.692464406482727e-07,
+ "loss": 0.0,
+ "step": 391
+ },
+ {
+ "epoch": 4.5625,
+ "grad_norm": 0.0018361720722168684,
+ "learning_rate": 9.541462829436426e-07,
+ "loss": 0.0,
+ "step": 392
+ },
+ {
+ "epoch": 4.57421875,
+ "grad_norm": 0.0029881680384278297,
+ "learning_rate": 9.39136886036166e-07,
+ "loss": 0.0,
+ "step": 393
+ },
+ {
+ "epoch": 4.5859375,
+ "grad_norm": 0.0030923946760594845,
+ "learning_rate": 9.24219131163705e-07,
+ "loss": 0.0,
+ "step": 394
+ },
+ {
+ "epoch": 4.59765625,
+ "grad_norm": 0.0014424376422539353,
+ "learning_rate": 9.093938941836012e-07,
+ "loss": 0.0,
+ "step": 395
+ },
+ {
+ "epoch": 4.609375,
+ "grad_norm": 0.0018437248654663563,
+ "learning_rate": 8.946620455212438e-07,
+ "loss": 0.0,
+ "step": 396
+ },
+ {
+ "epoch": 4.62109375,
+ "grad_norm": 0.0035209229681640863,
+ "learning_rate": 8.800244501189722e-07,
+ "loss": 0.0003,
+ "step": 397
+ },
+ {
+ "epoch": 4.6328125,
+ "grad_norm": 0.19659285247325897,
+ "learning_rate": 8.654819673852874e-07,
+ "loss": 0.0,
+ "step": 398
+ },
+ {
+ "epoch": 4.64453125,
+ "grad_norm": 0.17559310793876648,
+ "learning_rate": 8.510354511443975e-07,
+ "loss": 0.0003,
+ "step": 399
+ },
+ {
+ "epoch": 4.65625,
+ "grad_norm": 0.0017143903532996774,
+ "learning_rate": 8.366857495860869e-07,
+ "loss": 0.0,
+ "step": 400
+ },
+ {
+ "epoch": 4.66796875,
+ "grad_norm": 0.008345520123839378,
+ "learning_rate": 8.224337052159154e-07,
+ "loss": 0.0,
+ "step": 401
+ },
+ {
+ "epoch": 4.6796875,
+ "grad_norm": 0.001156082609668374,
+ "learning_rate": 8.082801548057553e-07,
+ "loss": 0.0,
+ "step": 402
+ },
+ {
+ "epoch": 4.69140625,
+ "grad_norm": 0.0014560276176780462,
+ "learning_rate": 7.942259293446594e-07,
+ "loss": 0.0,
+ "step": 403
+ },
+ {
+ "epoch": 4.703125,
+ "grad_norm": 0.0013030421687290072,
+ "learning_rate": 7.802718539900761e-07,
+ "loss": 0.0,
+ "step": 404
+ },
+ {
+ "epoch": 4.71484375,
+ "grad_norm": 0.0012356194201856852,
+ "learning_rate": 7.66418748019396e-07,
+ "loss": 0.0,
+ "step": 405
+ },
+ {
+ "epoch": 4.7265625,
+ "grad_norm": 0.004214293789118528,
+ "learning_rate": 7.526674247818569e-07,
+ "loss": 0.0,
+ "step": 406
+ },
+ {
+ "epoch": 4.73828125,
+ "grad_norm": 0.003626940306276083,
+ "learning_rate": 7.390186916507869e-07,
+ "loss": 0.0,
+ "step": 407
+ },
+ {
+ "epoch": 4.75,
+ "grad_norm": 0.003801507642492652,
+ "learning_rate": 7.254733499761993e-07,
+ "loss": 0.0,
+ "step": 408
+ },
+ {
+ "epoch": 4.76171875,
+ "grad_norm": 0.0023032291792333126,
+ "learning_rate": 7.120321950377487e-07,
+ "loss": 0.0,
+ "step": 409
+ },
+ {
+ "epoch": 4.7734375,
+ "grad_norm": 0.0018953473772853613,
+ "learning_rate": 6.986960159980327e-07,
+ "loss": 0.0,
+ "step": 410
+ },
+ {
+ "epoch": 4.78515625,
+ "grad_norm": 0.0011394222965463996,
+ "learning_rate": 6.854655958562625e-07,
+ "loss": 0.0,
+ "step": 411
+ },
+ {
+ "epoch": 4.796875,
+ "grad_norm": 0.0021377848461270332,
+ "learning_rate": 6.723417114022907e-07,
+ "loss": 0.0,
+ "step": 412
+ },
+ {
+ "epoch": 4.80859375,
+ "grad_norm": 0.0011264781933277845,
+ "learning_rate": 6.593251331709993e-07,
+ "loss": 0.0,
+ "step": 413
+ },
+ {
+ "epoch": 4.8203125,
+ "grad_norm": 0.004995762836188078,
+ "learning_rate": 6.464166253970672e-07,
+ "loss": 0.0,
+ "step": 414
+ },
+ {
+ "epoch": 4.83203125,
+ "grad_norm": 0.0014515212969854474,
+ "learning_rate": 6.336169459700933e-07,
+ "loss": 0.0,
+ "step": 415
+ },
+ {
+ "epoch": 4.84375,
+ "grad_norm": 0.000913277908693999,
+ "learning_rate": 6.209268463901047e-07,
+ "loss": 0.0,
+ "step": 416
+ },
+ {
+ "epoch": 4.85546875,
+ "grad_norm": 0.010075507685542107,
+ "learning_rate": 6.083470717234285e-07,
+ "loss": 0.0,
+ "step": 417
+ },
+ {
+ "epoch": 4.8671875,
+ "grad_norm": 0.0015437327092513442,
+ "learning_rate": 5.95878360558953e-07,
+ "loss": 0.0,
+ "step": 418
+ },
+ {
+ "epoch": 4.87890625,
+ "grad_norm": 0.0008694503339938819,
+ "learning_rate": 5.835214449647602e-07,
+ "loss": 0.0,
+ "step": 419
+ },
+ {
+ "epoch": 4.890625,
+ "grad_norm": 0.003764442168176174,
+ "learning_rate": 5.712770504451426e-07,
+ "loss": 0.0,
+ "step": 420
+ },
+ {
+ "epoch": 4.90234375,
+ "grad_norm": 0.0019374670227989554,
+ "learning_rate": 5.591458958980123e-07,
+ "loss": 0.0,
+ "step": 421
+ },
+ {
+ "epoch": 4.9140625,
+ "grad_norm": 0.00113675557076931,
+ "learning_rate": 5.471286935726866e-07,
+ "loss": 0.0,
+ "step": 422
+ },
+ {
+ "epoch": 4.92578125,
+ "grad_norm": 0.001957179745659232,
+ "learning_rate": 5.352261490280767e-07,
+ "loss": 0.0,
+ "step": 423
+ },
+ {
+ "epoch": 4.9375,
+ "grad_norm": 0.00735822319984436,
+ "learning_rate": 5.234389610912552e-07,
+ "loss": 0.0,
+ "step": 424
+ },
+ {
+ "epoch": 4.94921875,
+ "grad_norm": 0.0010691111674532294,
+ "learning_rate": 5.117678218164337e-07,
+ "loss": 0.0,
+ "step": 425
+ }
+ ],
+ "logging_steps": 1,
+ "max_steps": 510,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 6,
+ "save_steps": 85,
+ "stateful_callbacks": {
+ "TrainerControl": {
+ "args": {
+ "should_epoch_stop": false,
+ "should_evaluate": false,
+ "should_log": false,
+ "should_save": true,
+ "should_training_stop": false
+ },
+ "attributes": {}
+ }
+ },
+ "total_flos": 1.0668294982651085e+18,
+ "train_batch_size": 4,
+ "trial_name": null,
+ "trial_params": null
+}
diff --git a/checkpoint-425/training_args.bin b/checkpoint-425/training_args.bin
new file mode 100644
index 0000000000000000000000000000000000000000..31435c2b54979c306fa2a089f64bc8d21e1d21cf
--- /dev/null
+++ b/checkpoint-425/training_args.bin
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ae0e02a237d0ed5071f0d2c656d0cc6fa0293647ec7cffc6f8d299311f592cdc
+size 8056
diff --git a/checkpoint-425/zero_to_fp32.py b/checkpoint-425/zero_to_fp32.py
new file mode 100644
index 0000000000000000000000000000000000000000..24cc342e78d1a006c782b3a4cd68d9ce786d8fd8
--- /dev/null
+++ b/checkpoint-425/zero_to_fp32.py
@@ -0,0 +1,604 @@
+#!/usr/bin/env python
+
+# Copyright (c) Microsoft Corporation.
+# SPDX-License-Identifier: Apache-2.0
+
+# DeepSpeed Team
+
+# This script extracts fp32 consolidated weights from ZeRO stage 1, 2, and 3 DeepSpeed checkpoints.
+# It gets copied into the top-level checkpoint dir, so the user can easily do the conversion at any
+# point in the future. Once extracted, the weights don't require DeepSpeed and can be used in any
+# application.
+#
+# example: python zero_to_fp32.py . pytorch_model.bin
+
+import argparse
+import torch
+import glob
+import math
+import os
+import re
+from collections import OrderedDict
+from dataclasses import dataclass
+
+# While this script doesn't use DeepSpeed to recover data, the checkpoints are pickled with
+# DeepSpeed data structures, so DeepSpeed must be available in the current Python environment.
+from deepspeed.utils import logger
+from deepspeed.checkpoint.constants import (DS_VERSION, OPTIMIZER_STATE_DICT, SINGLE_PARTITION_OF_FP32_GROUPS,
+ FP32_FLAT_GROUPS, ZERO_STAGE, PARTITION_COUNT, PARAM_SHAPES, BUFFER_NAMES,
+ FROZEN_PARAM_SHAPES, FROZEN_PARAM_FRAGMENTS)
+
+
+@dataclass
+class zero_model_state:
+ buffers: dict
+ param_shapes: dict
+ shared_params: list
+ ds_version: int
+ frozen_param_shapes: dict
+ frozen_param_fragments: dict
+
+
+debug = 0
+
+# load to cpu
+device = torch.device('cpu')
+
+
+def atoi(text):
+ return int(text) if text.isdigit() else text
+
+
+def natural_keys(text):
+ '''
+ alist.sort(key=natural_keys) sorts in human order
+ http://nedbatchelder.com/blog/200712/human_sorting.html
+ (See Toothy's implementation in the comments)
+ '''
+ return [atoi(c) for c in re.split(r'(\d+)', text)]
+
+
+def get_model_state_file(checkpoint_dir, zero_stage):
+ if not os.path.isdir(checkpoint_dir):
+ raise FileNotFoundError(f"Directory '{checkpoint_dir}' doesn't exist")
+
+ # there should be only one file
+ if zero_stage <= 2:
+ file = os.path.join(checkpoint_dir, "mp_rank_00_model_states.pt")
+ elif zero_stage == 3:
+ file = os.path.join(checkpoint_dir, "zero_pp_rank_0_mp_rank_00_model_states.pt")
+
+ if not os.path.exists(file):
+ raise FileNotFoundError(f"can't find model states file at '{file}'")
+
+ return file
+
+
+def get_checkpoint_files(checkpoint_dir, glob_pattern):
+ # XXX: need to test that this simple glob rule works for multi-node setup too
+ ckpt_files = sorted(glob.glob(os.path.join(checkpoint_dir, glob_pattern)), key=natural_keys)
+
+ if len(ckpt_files) == 0:
+ raise FileNotFoundError(f"can't find {glob_pattern} files in directory '{checkpoint_dir}'")
+
+ return ckpt_files
+
+
+def get_optim_files(checkpoint_dir):
+ return get_checkpoint_files(checkpoint_dir, "*_optim_states.pt")
+
+
+def get_model_state_files(checkpoint_dir):
+ return get_checkpoint_files(checkpoint_dir, "*_model_states.pt")
+
+
+def parse_model_states(files):
+ zero_model_states = []
+ for file in files:
+ state_dict = torch.load(file, map_location=device)
+
+ if BUFFER_NAMES not in state_dict:
+ raise ValueError(f"{file} is not a model state checkpoint")
+ buffer_names = state_dict[BUFFER_NAMES]
+ if debug:
+ print("Found buffers:", buffer_names)
+
+ # recover just the buffers while restoring them to fp32 if they were saved in fp16
+ buffers = {k: v.float() for k, v in state_dict["module"].items() if k in buffer_names}
+ param_shapes = state_dict[PARAM_SHAPES]
+
+ # collect parameters that are included in param_shapes
+ param_names = []
+ for s in param_shapes:
+ for name in s.keys():
+ param_names.append(name)
+
+ # update with frozen parameters
+ frozen_param_shapes = state_dict.get(FROZEN_PARAM_SHAPES, None)
+ if frozen_param_shapes is not None:
+ if debug:
+ print(f"Found frozen_param_shapes: {frozen_param_shapes}")
+ param_names += list(frozen_param_shapes.keys())
+
+ # handle shared params
+ shared_params = [[k, v] for k, v in state_dict["shared_params"].items()]
+
+ ds_version = state_dict.get(DS_VERSION, None)
+
+ frozen_param_fragments = state_dict.get(FROZEN_PARAM_FRAGMENTS, None)
+
+ z_model_state = zero_model_state(buffers=buffers,
+ param_shapes=param_shapes,
+ shared_params=shared_params,
+ ds_version=ds_version,
+ frozen_param_shapes=frozen_param_shapes,
+ frozen_param_fragments=frozen_param_fragments)
+ zero_model_states.append(z_model_state)
+
+ return zero_model_states
+
+
+def parse_optim_states(files, ds_checkpoint_dir):
+
+ total_files = len(files)
+ state_dicts = []
+ for f in files:
+ state_dict = torch.load(f, map_location=device)
+ # immediately discard the potentially huge optimizer states, since we only care about the
+ # fp32 master weights; also handle the case where they were already removed by another helper script
+ state_dict["optimizer_state_dict"].pop("optimizer_state_dict", None)
+ state_dicts.append(state_dict)
+
+ if ZERO_STAGE not in state_dicts[0][OPTIMIZER_STATE_DICT]:
+ raise ValueError(f"{files[0]} is not a zero checkpoint")
+ zero_stage = state_dicts[0][OPTIMIZER_STATE_DICT][ZERO_STAGE]
+ world_size = state_dicts[0][OPTIMIZER_STATE_DICT][PARTITION_COUNT]
+
+ # For ZeRO-2 each param group can have different partition_count as data parallelism for expert
+ # parameters can be different from data parallelism for non-expert parameters. So we can just
+ # use the max of the partition_count to get the dp world_size.
+
+ if isinstance(world_size, list):
+ world_size = max(world_size)
+
+ if world_size != total_files:
+ raise ValueError(
+ f"Expected {world_size} '*_optim_states.pt' files under '{ds_checkpoint_dir}' but found {total_files}. "
+ "Possibly due to an overwrite of an old checkpoint, or a checkpoint didn't get saved by one or more processes."
+ )
+
+ # the groups are named differently in each stage
+ if zero_stage <= 2:
+ fp32_groups_key = SINGLE_PARTITION_OF_FP32_GROUPS
+ elif zero_stage == 3:
+ fp32_groups_key = FP32_FLAT_GROUPS
+ else:
+ raise ValueError(f"unknown zero stage {zero_stage}")
+
+ if zero_stage <= 2:
+ fp32_flat_groups = [state_dicts[i][OPTIMIZER_STATE_DICT][fp32_groups_key] for i in range(len(state_dicts))]
+ elif zero_stage == 3:
+ # if there is more than one param group, there will be multiple flattened tensors - one
+ # flattened tensor per group - for simplicity merge them into a single tensor
+ #
+ # XXX: could make the script more memory efficient for when there are multiple groups - it
+ # will require matching the sub-lists of param_shapes for each param group flattened tensor
+
+ fp32_flat_groups = [
+ torch.cat(state_dicts[i][OPTIMIZER_STATE_DICT][fp32_groups_key], 0) for i in range(len(state_dicts))
+ ]
+
+ return zero_stage, world_size, fp32_flat_groups
+
+
+def _get_fp32_state_dict_from_zero_checkpoint(ds_checkpoint_dir, exclude_frozen_parameters):
+ """
+ Returns an fp32 state_dict reconstructed from a DeepSpeed checkpoint.
+
+ Args:
+ - ``ds_checkpoint_dir``: path to the deepspeed checkpoint folder (where the optimizer files are)
+ - ``exclude_frozen_parameters``: whether to exclude frozen parameters from the returned state_dict
+ """
+ print(f"Processing zero checkpoint '{ds_checkpoint_dir}'")
+
+ optim_files = get_optim_files(ds_checkpoint_dir)
+ zero_stage, world_size, fp32_flat_groups = parse_optim_states(optim_files, ds_checkpoint_dir)
+ print(f"Detected checkpoint of type zero stage {zero_stage}, world_size: {world_size}")
+
+ model_files = get_model_state_files(ds_checkpoint_dir)
+
+ zero_model_states = parse_model_states(model_files)
+ print(f'Parsing checkpoint created by deepspeed=={zero_model_states[0].ds_version}')
+
+ if zero_stage <= 2:
+ return _get_fp32_state_dict_from_zero2_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters)
+ elif zero_stage == 3:
+ return _get_fp32_state_dict_from_zero3_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters)
+
+
+def _zero2_merge_frozen_params(state_dict, zero_model_states):
+ if zero_model_states[0].frozen_param_shapes is None or len(zero_model_states[0].frozen_param_shapes) == 0:
+ return
+
+ frozen_param_shapes = zero_model_states[0].frozen_param_shapes
+ frozen_param_fragments = zero_model_states[0].frozen_param_fragments
+
+ if debug:
+ num_elem = sum(s.numel() for s in frozen_param_shapes.values())
+ print(f'rank 0: {FROZEN_PARAM_SHAPES}.numel = {num_elem}')
+
+ wanted_params = len(frozen_param_shapes)
+ wanted_numel = sum(s.numel() for s in frozen_param_shapes.values())
+ avail_numel = sum([p.numel() for p in frozen_param_fragments.values()])
+ print(f'Frozen params: Have {avail_numel} numels to process.')
+ print(f'Frozen params: Need {wanted_numel} numels in {wanted_params} params')
+
+ total_params = 0
+ total_numel = 0
+ for name, shape in frozen_param_shapes.items():
+ total_params += 1
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+
+ state_dict[name] = frozen_param_fragments[name]
+
+ if debug:
+ print(f"{name} full shape: {shape} unpartitioned numel {unpartitioned_numel} ")
+
+ print(f"Reconstructed Frozen fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _has_callable(obj, fn):
+ attr = getattr(obj, fn, None)
+ return callable(attr)
+
+
+def _zero2_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states):
+ param_shapes = zero_model_states[0].param_shapes
+
+ # Reconstruction protocol:
+ #
+ # XXX: document this
+
+ if debug:
+ for i in range(world_size):
+ for j in range(len(fp32_flat_groups[0])):
+ print(f"{FP32_FLAT_GROUPS}[{i}][{j}].shape={fp32_flat_groups[i][j].shape}")
+
+ # XXX: memory usage doubles here (zero2)
+ num_param_groups = len(fp32_flat_groups[0])
+ merged_single_partition_of_fp32_groups = []
+ for i in range(num_param_groups):
+ merged_partitions = [sd[i] for sd in fp32_flat_groups]
+ full_single_fp32_vector = torch.cat(merged_partitions, 0)
+ merged_single_partition_of_fp32_groups.append(full_single_fp32_vector)
+ avail_numel = sum(
+ [full_single_fp32_vector.numel() for full_single_fp32_vector in merged_single_partition_of_fp32_groups])
+
+ if debug:
+ wanted_params = sum([len(shapes) for shapes in param_shapes])
+ wanted_numel = sum([sum(shape.numel() for shape in shapes.values()) for shapes in param_shapes])
+ # not asserting if there is a mismatch due to possible padding
+ print(f"Have {avail_numel} numels to process.")
+ print(f"Need {wanted_numel} numels in {wanted_params} params.")
+
+ # params
+ # XXX: for huge models that can't fit into the host's RAM we will have to recode this to support
+ # out-of-core computing solution
+ total_numel = 0
+ total_params = 0
+ for shapes, full_single_fp32_vector in zip(param_shapes, merged_single_partition_of_fp32_groups):
+ offset = 0
+ avail_numel = full_single_fp32_vector.numel()
+ for name, shape in shapes.items():
+
+ unpartitioned_numel = shape.numel() if _has_callable(shape, 'numel') else math.prod(shape)
+ total_numel += unpartitioned_numel
+ total_params += 1
+
+ if debug:
+ print(f"{name} full shape: {shape} unpartitioned numel {unpartitioned_numel} ")
+ state_dict[name] = full_single_fp32_vector.narrow(0, offset, unpartitioned_numel).view(shape)
+ offset += unpartitioned_numel
+
+ # Z2 started to align to 2*world_size to improve nccl performance. Therefore both offset and
+ # avail_numel can differ by anywhere between 0..2*world_size. Due to two unrelated complex
+ # paddings performed in the code it's almost impossible to predict the exact numbers w/o the
+ # live optimizer object, so we are checking that the numbers are within the right range
+ align_to = 2 * world_size
+
+ def zero2_align(x):
+ return align_to * math.ceil(x / align_to)
+
+ if debug:
+ print(f"original offset={offset}, avail_numel={avail_numel}")
+
+ offset = zero2_align(offset)
+ avail_numel = zero2_align(avail_numel)
+
+ if debug:
+ print(f"aligned offset={offset}, avail_numel={avail_numel}")
+
+ # Sanity check
+ if offset != avail_numel:
+ raise ValueError(f"consumed {offset} numels out of {avail_numel} - something is wrong")
+
+ print(f"Reconstructed fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _get_fp32_state_dict_from_zero2_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters):
+ state_dict = OrderedDict()
+
+ # buffers
+ buffers = zero_model_states[0].buffers
+ state_dict.update(buffers)
+ if debug:
+ print(f"added {len(buffers)} buffers")
+
+ if not exclude_frozen_parameters:
+ _zero2_merge_frozen_params(state_dict, zero_model_states)
+
+ _zero2_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states)
+
+ # recover shared parameters
+ for pair in zero_model_states[0].shared_params:
+ if pair[1] in state_dict:
+ state_dict[pair[0]] = state_dict[pair[1]]
+
+ return state_dict
+
+
+def zero3_partitioned_param_info(unpartitioned_numel, world_size):
+ # e.g. unpartitioned_numel=10, world_size=4 -> partitioned_numel=3, padding_numel=2
+ # (each rank holds ceil(10/4)=3 numels; 4*3=12 covers the 10 real numels plus 2 of padding)
+ remainder = unpartitioned_numel % world_size
+ padding_numel = (world_size - remainder) if remainder else 0
+ partitioned_numel = math.ceil(unpartitioned_numel / world_size)
+ return partitioned_numel, padding_numel
+
+
+def _zero3_merge_frozen_params(state_dict, world_size, zero_model_states):
+ if zero_model_states[0].frozen_param_shapes is None or len(zero_model_states[0].frozen_param_shapes) == 0:
+ return
+
+ if debug:
+ for i in range(world_size):
+ num_elem = sum(s.numel() for s in zero_model_states[i].frozen_param_fragments.values())
+ print(f'rank {i}: {FROZEN_PARAM_SHAPES}.numel = {num_elem}')
+
+ frozen_param_shapes = zero_model_states[0].frozen_param_shapes
+ wanted_params = len(frozen_param_shapes)
+ wanted_numel = sum(s.numel() for s in frozen_param_shapes.values())
+ avail_numel = sum([p.numel() for p in zero_model_states[0].frozen_param_fragments.values()]) * world_size
+ print(f'Frozen params: Have {avail_numel} numels to process.')
+ print(f'Frozen params: Need {wanted_numel} numels in {wanted_params} params')
+
+ total_params = 0
+ total_numel = 0
+ for name, shape in zero_model_states[0].frozen_param_shapes.items():
+ total_params += 1
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+
+ param_frags = tuple(model_state.frozen_param_fragments[name] for model_state in zero_model_states)
+ state_dict[name] = torch.cat(param_frags, 0).narrow(0, 0, unpartitioned_numel).view(shape)
+
+ partitioned_numel, partitioned_padding_numel = zero3_partitioned_param_info(unpartitioned_numel, world_size)
+
+ if debug:
+ print(
+ f"Frozen params: {total_params} {name} full shape: {shape} partition0 numel={partitioned_numel} partitioned_padding_numel={partitioned_padding_numel}"
+ )
+
+ print(f"Reconstructed Frozen fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _zero3_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states):
+ param_shapes = zero_model_states[0].param_shapes
+ avail_numel = fp32_flat_groups[0].numel() * world_size
+ # Reconstruction protocol: For zero3 we need to zip the partitions together at boundary of each
+ # param, re-consolidating each param, while dealing with padding if any
+
+ # merge list of dicts, preserving order
+ param_shapes = {k: v for d in param_shapes for k, v in d.items()}
+
+ if debug:
+ for i in range(world_size):
+ print(f"{FP32_FLAT_GROUPS}[{i}].shape={fp32_flat_groups[i].shape}")
+
+ wanted_params = len(param_shapes)
+ wanted_numel = sum(shape.numel() for shape in param_shapes.values())
+ # not asserting if there is a mismatch due to possible padding
+ avail_numel = fp32_flat_groups[0].numel() * world_size
+ print(f"Trainable params: Have {avail_numel} numels to process.")
+ print(f"Trainable params: Need {wanted_numel} numels in {wanted_params} params.")
+
+ # params
+ # XXX: for huge models that can't fit into the host's RAM we will have to recode this to support
+ # out-of-core computing solution
+ offset = 0
+ total_numel = 0
+ total_params = 0
+ for name, shape in param_shapes.items():
+
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+ total_params += 1
+
+ partitioned_numel, partitioned_padding_numel = zero3_partitioned_param_info(unpartitioned_numel, world_size)
+
+ if debug:
+ print(
+ f"Trainable params: {total_params} {name} full shape: {shape} partition0 numel={partitioned_numel} partitioned_padding_numel={partitioned_padding_numel}"
+ )
+
+ # XXX: memory usage doubles here
+ state_dict[name] = torch.cat(
+ tuple(fp32_flat_groups[i].narrow(0, offset, partitioned_numel) for i in range(world_size)),
+ 0).narrow(0, 0, unpartitioned_numel).view(shape)
+ offset += partitioned_numel
+
+ offset *= world_size
+
+ # Sanity check
+ if offset != avail_numel:
+ raise ValueError(f"consumed {offset} numels out of {avail_numel} - something is wrong")
+
+ print(f"Reconstructed Trainable fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _get_fp32_state_dict_from_zero3_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters):
+ state_dict = OrderedDict()
+
+ # buffers
+ buffers = zero_model_states[0].buffers
+ state_dict.update(buffers)
+ if debug:
+ print(f"added {len(buffers)} buffers")
+
+ if not exclude_frozen_parameters:
+ _zero3_merge_frozen_params(state_dict, world_size, zero_model_states)
+
+ _zero3_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states)
+
+ # recover shared parameters
+ for pair in zero_model_states[0].shared_params:
+ if pair[1] in state_dict:
+ state_dict[pair[0]] = state_dict[pair[1]]
+
+ return state_dict
+
+
+def get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag=None, exclude_frozen_parameters=False):
+ """
+ Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated state_dict that can be loaded with
+ ``load_state_dict()`` and used for training without DeepSpeed or shared with others, for example
+ via a model hub.
+
+ Args:
+ - ``checkpoint_dir``: path to the desired checkpoint folder
+        - ``tag``: checkpoint tag used as a unique identifier for checkpoint. If not provided, will attempt to load the tag from the 'latest' file, e.g. ``global_step14``
+ - ``exclude_frozen_parameters``: exclude frozen parameters
+
+ Returns:
+ - pytorch ``state_dict``
+
+ Note: this approach may not work if your application doesn't have sufficient free CPU memory and
+ you may need to use the offline approach using the ``zero_to_fp32.py`` script that is saved with
+ the checkpoint.
+
+ A typical usage might be ::
+
+ from deepspeed.utils.zero_to_fp32 import get_fp32_state_dict_from_zero_checkpoint
+ # do the training and checkpoint saving
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir) # already on cpu
+ model = model.cpu() # move to cpu
+ model.load_state_dict(state_dict)
+ # submit to model hub or save the model to share with others
+
+    In this example the ``model`` will no longer be usable in the deepspeed context of the same
+    application, i.e. you will need to re-initialize the deepspeed engine, since
+    ``model.load_state_dict(state_dict)`` will remove all the deepspeed magic from it.
+
+ If you want it all done for you, use ``load_state_dict_from_zero_checkpoint`` instead.
+
+ """
+ if tag is None:
+ latest_path = os.path.join(checkpoint_dir, 'latest')
+ if os.path.isfile(latest_path):
+ with open(latest_path, 'r') as fd:
+ tag = fd.read().strip()
+ else:
+ raise ValueError(f"Unable to find 'latest' file at {latest_path}")
+
+ ds_checkpoint_dir = os.path.join(checkpoint_dir, tag)
+
+ if not os.path.isdir(ds_checkpoint_dir):
+ raise FileNotFoundError(f"Directory '{ds_checkpoint_dir}' doesn't exist")
+
+ return _get_fp32_state_dict_from_zero_checkpoint(ds_checkpoint_dir, exclude_frozen_parameters)
+
+
+def convert_zero_checkpoint_to_fp32_state_dict(checkpoint_dir, output_file, tag=None, exclude_frozen_parameters=False):
+ """
+ Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated ``state_dict`` file that can be
+ loaded with ``torch.load(file)`` + ``load_state_dict()`` and used for training without DeepSpeed.
+
+ Args:
+ - ``checkpoint_dir``: path to the desired checkpoint folder. (one that contains the tag-folder, like ``global_step14``)
+ - ``output_file``: path to the pytorch fp32 state_dict output file (e.g. path/pytorch_model.bin)
+        - ``tag``: checkpoint tag used as a unique identifier for checkpoint. If not provided, will attempt to load the tag from the file named ``latest`` in the checkpoint folder, e.g., ``global_step14``
+ - ``exclude_frozen_parameters``: exclude frozen parameters
+ """
+
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag, exclude_frozen_parameters)
+ print(f"Saving fp32 state dict to {output_file}")
+ torch.save(state_dict, output_file)
+
+
+def load_state_dict_from_zero_checkpoint(model, checkpoint_dir, tag=None):
+ """
+ 1. Put the provided model to cpu
+ 2. Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated ``state_dict``
+ 3. Load it into the provided model
+
+ Args:
+ - ``model``: the model object to update
+ - ``checkpoint_dir``: path to the desired checkpoint folder. (one that contains the tag-folder, like ``global_step14``)
+        - ``tag``: checkpoint tag used as a unique identifier for checkpoint. If not provided, will attempt to load the tag from the file named ``latest`` in the checkpoint folder, e.g., ``global_step14``
+
+ Returns:
+        - ``model``: modified model
+
+ Make sure you have plenty of CPU memory available before you call this function. If you don't
+ have enough use the ``zero_to_fp32.py`` utility to do the conversion. You will find it
+ conveniently placed for you in the checkpoint folder.
+
+ A typical usage might be ::
+
+ from deepspeed.utils.zero_to_fp32 import load_state_dict_from_zero_checkpoint
+ model = load_state_dict_from_zero_checkpoint(trainer.model, checkpoint_dir)
+ # submit to model hub or save the model to share with others
+
+    Note that once this has run, the ``model`` will no longer be usable in the deepspeed context
+    of the same application, i.e. you will need to re-initialize the deepspeed engine, since
+    ``model.load_state_dict(state_dict)`` will remove all the deepspeed magic from it.
+
+ """
+    logger.info("Extracting fp32 weights")
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag)
+
+    logger.info("Overwriting model with fp32 weights")
+ model = model.cpu()
+ model.load_state_dict(state_dict, strict=False)
+
+ return model
+
+
+if __name__ == "__main__":
+
+ parser = argparse.ArgumentParser()
+ parser.add_argument("checkpoint_dir",
+ type=str,
+ help="path to the desired checkpoint folder, e.g., path/checkpoint-12")
+ parser.add_argument(
+ "output_file",
+ type=str,
+ help="path to the pytorch fp32 state_dict output file (e.g. path/checkpoint-12/pytorch_model.bin)")
+ parser.add_argument("-t",
+ "--tag",
+ type=str,
+ default=None,
+ help="checkpoint tag used as a unique identifier for checkpoint. e.g., global_step1")
+ parser.add_argument("--exclude_frozen_parameters", action='store_true', help="exclude frozen parameters")
+ parser.add_argument("-d", "--debug", action='store_true', help="enable debug")
+ args = parser.parse_args()
+
+ debug = args.debug
+
+ convert_zero_checkpoint_to_fp32_state_dict(args.checkpoint_dir,
+ args.output_file,
+ tag=args.tag,
+ exclude_frozen_parameters=args.exclude_frozen_parameters)
diff --git a/checkpoint-510/README.md b/checkpoint-510/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..be5c87703f12b547886cc6a2ecfbe9ee150496fa
--- /dev/null
+++ b/checkpoint-510/README.md
@@ -0,0 +1,202 @@
+---
+base_model: meta-llama/Llama-3.1-8B-Instruct
+library_name: peft
+---
+
+# Model Card for Model ID
+
+
+
+
+
+## Model Details
+
+### Model Description
+
+
+
+
+
+- **Developed by:** [More Information Needed]
+- **Funded by [optional]:** [More Information Needed]
+- **Shared by [optional]:** [More Information Needed]
+- **Model type:** [More Information Needed]
+- **Language(s) (NLP):** [More Information Needed]
+- **License:** [More Information Needed]
+- **Finetuned from model [optional]:** [More Information Needed]
+
+### Model Sources [optional]
+
+
+
+- **Repository:** [More Information Needed]
+- **Paper [optional]:** [More Information Needed]
+- **Demo [optional]:** [More Information Needed]
+
+## Uses
+
+
+
+### Direct Use
+
+
+
+[More Information Needed]
+
+### Downstream Use [optional]
+
+
+
+[More Information Needed]
+
+### Out-of-Scope Use
+
+
+
+[More Information Needed]
+
+## Bias, Risks, and Limitations
+
+
+
+[More Information Needed]
+
+### Recommendations
+
+
+
+Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
+
+## How to Get Started with the Model
+
+Use the code below to get started with the model.
+
+[More Information Needed]
+
+## Training Details
+
+### Training Data
+
+
+
+[More Information Needed]
+
+### Training Procedure
+
+
+
+#### Preprocessing [optional]
+
+[More Information Needed]
+
+
+#### Training Hyperparameters
+
+- **Training regime:** [More Information Needed]
+
+#### Speeds, Sizes, Times [optional]
+
+
+
+[More Information Needed]
+
+## Evaluation
+
+
+
+### Testing Data, Factors & Metrics
+
+#### Testing Data
+
+
+
+[More Information Needed]
+
+#### Factors
+
+
+
+[More Information Needed]
+
+#### Metrics
+
+
+
+[More Information Needed]
+
+### Results
+
+[More Information Needed]
+
+#### Summary
+
+
+
+## Model Examination [optional]
+
+
+
+[More Information Needed]
+
+## Environmental Impact
+
+
+
+Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
+
+- **Hardware Type:** [More Information Needed]
+- **Hours used:** [More Information Needed]
+- **Cloud Provider:** [More Information Needed]
+- **Compute Region:** [More Information Needed]
+- **Carbon Emitted:** [More Information Needed]
+
+## Technical Specifications [optional]
+
+### Model Architecture and Objective
+
+[More Information Needed]
+
+### Compute Infrastructure
+
+[More Information Needed]
+
+#### Hardware
+
+[More Information Needed]
+
+#### Software
+
+[More Information Needed]
+
+## Citation [optional]
+
+
+
+**BibTeX:**
+
+[More Information Needed]
+
+**APA:**
+
+[More Information Needed]
+
+## Glossary [optional]
+
+
+
+[More Information Needed]
+
+## More Information [optional]
+
+[More Information Needed]
+
+## Model Card Authors [optional]
+
+[More Information Needed]
+
+## Model Card Contact
+
+[More Information Needed]
+### Framework versions
+
+- PEFT 0.14.0
\ No newline at end of file
diff --git a/checkpoint-510/adapter_config.json b/checkpoint-510/adapter_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..9dfb3ab60881d002c4cdbcc157a93958018fe683
--- /dev/null
+++ b/checkpoint-510/adapter_config.json
@@ -0,0 +1,40 @@
+{
+ "alpha_pattern": {},
+ "auto_mapping": null,
+ "base_model_name_or_path": "meta-llama/Llama-3.1-8B-Instruct",
+ "bias": "none",
+ "eva_config": null,
+ "exclude_modules": null,
+ "fan_in_fan_out": null,
+ "inference_mode": true,
+ "init_lora_weights": true,
+ "layer_replication": null,
+ "layers_pattern": null,
+ "layers_to_transform": null,
+ "loftq_config": {},
+ "lora_alpha": 512,
+ "lora_bias": false,
+ "lora_dropout": 0.05,
+ "megatron_config": null,
+ "megatron_core": "megatron.core",
+ "modules_to_save": [
+ "embed_tokens",
+ "lm_head"
+ ],
+ "peft_type": "LORA",
+ "r": 256,
+ "rank_pattern": {},
+ "revision": null,
+ "target_modules": [
+ "v_proj",
+ "up_proj",
+ "q_proj",
+ "o_proj",
+ "down_proj",
+ "gate_proj",
+ "k_proj"
+ ],
+ "task_type": "CAUSAL_LM",
+ "use_dora": false,
+ "use_rslora": false
+}
\ No newline at end of file
diff --git a/checkpoint-510/adapter_model.safetensors b/checkpoint-510/adapter_model.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..0bef38c297c36de0879e0a49295516870d706c7b
--- /dev/null
+++ b/checkpoint-510/adapter_model.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6d78ee61e25a9d389b642e2bc6f05ef7dcbc5f6ac8b35064353554607b639beb
+size 3443586272
diff --git a/checkpoint-510/global_step506/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt b/checkpoint-510/global_step506/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt
new file mode 100644
index 0000000000000000000000000000000000000000..3d68c546db9209b570b0bb30091dd1635443ff36
--- /dev/null
+++ b/checkpoint-510/global_step506/bf16_zero_pp_rank_0_mp_rank_00_optim_states.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:46b192ecba168cd0e99f2ee3a40edf85f53a3af444986f4064df43f7ee08df90
+size 20661195036
diff --git a/checkpoint-510/global_step506/mp_rank_00_model_states.pt b/checkpoint-510/global_step506/mp_rank_00_model_states.pt
new file mode 100644
index 0000000000000000000000000000000000000000..743b3d3a138b810ad2624d38881d6db36c49dfd0
--- /dev/null
+++ b/checkpoint-510/global_step506/mp_rank_00_model_states.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:0e74184ac1a319d520e4e79a8fbebc64a45497a3d3b3b1fca15e94bd4793c8e6
+size 3555326649
diff --git a/checkpoint-510/latest b/checkpoint-510/latest
new file mode 100644
index 0000000000000000000000000000000000000000..8ee04ca845fd7ccbf613e083039f3fc2f159881b
--- /dev/null
+++ b/checkpoint-510/latest
@@ -0,0 +1 @@
+global_step506
\ No newline at end of file
diff --git a/checkpoint-510/rng_state.pth b/checkpoint-510/rng_state.pth
new file mode 100644
index 0000000000000000000000000000000000000000..b9d5eeecbe324cdaea9b0b3084ec25b4e83700e4
--- /dev/null
+++ b/checkpoint-510/rng_state.pth
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:068e6b6c1b4856e05054b38b9f5ab1b55bf5bc542d8f697deb385efdf9b56e41
+size 14244
diff --git a/checkpoint-510/scheduler.pt b/checkpoint-510/scheduler.pt
new file mode 100644
index 0000000000000000000000000000000000000000..003a9c68cc3ff0657675eabd2d66f01364fd9f0c
--- /dev/null
+++ b/checkpoint-510/scheduler.pt
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f7428ecd757a51cf94d2dbf9868f91f95ce084f9569253062b992d1c9d9f2d9c
+size 1064
diff --git a/checkpoint-510/special_tokens_map.json b/checkpoint-510/special_tokens_map.json
new file mode 100644
index 0000000000000000000000000000000000000000..278b7f0f84be865c4687700ee7b3c63d89a51e18
--- /dev/null
+++ b/checkpoint-510/special_tokens_map.json
@@ -0,0 +1,23 @@
+{
+ "bos_token": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|eot_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<|end_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+}
diff --git a/checkpoint-510/tokenizer.json b/checkpoint-510/tokenizer.json
new file mode 100644
index 0000000000000000000000000000000000000000..1c1d8d5c9024994f1d3b00f9662b8dd89ca13cf2
--- /dev/null
+++ b/checkpoint-510/tokenizer.json
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6b9e4e7fb171f92fd137b777cc2714bf87d11576700a1dcd7a399e7bbe39537b
+size 17209920
diff --git a/checkpoint-510/tokenizer_config.json b/checkpoint-510/tokenizer_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..ca91a2ef55f4239a7af81d7c9abb05f53621a07b
--- /dev/null
+++ b/checkpoint-510/tokenizer_config.json
@@ -0,0 +1,2064 @@
+{
+ "added_tokens_decoder": {
+ "128000": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128001": {
+ "content": "<|end_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128002": {
+ "content": "<|reserved_special_token_0|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128003": {
+ "content": "<|reserved_special_token_1|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128004": {
+ "content": "<|finetune_right_pad_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128005": {
+ "content": "<|reserved_special_token_2|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128006": {
+ "content": "<|start_header_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128007": {
+ "content": "<|end_header_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128008": {
+ "content": "<|eom_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128009": {
+ "content": "<|eot_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128010": {
+ "content": "<|python_tag|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128011": {
+ "content": "<|reserved_special_token_3|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128012": {
+ "content": "<|reserved_special_token_4|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128013": {
+ "content": "<|reserved_special_token_5|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128014": {
+ "content": "<|reserved_special_token_6|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128015": {
+ "content": "<|reserved_special_token_7|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128016": {
+ "content": "<|reserved_special_token_8|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128017": {
+ "content": "<|reserved_special_token_9|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128018": {
+ "content": "<|reserved_special_token_10|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128019": {
+ "content": "<|reserved_special_token_11|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128020": {
+ "content": "<|reserved_special_token_12|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128021": {
+ "content": "<|reserved_special_token_13|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128022": {
+ "content": "<|reserved_special_token_14|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128023": {
+ "content": "<|reserved_special_token_15|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128024": {
+ "content": "<|reserved_special_token_16|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128025": {
+ "content": "<|reserved_special_token_17|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128026": {
+ "content": "<|reserved_special_token_18|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128027": {
+ "content": "<|reserved_special_token_19|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128028": {
+ "content": "<|reserved_special_token_20|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128029": {
+ "content": "<|reserved_special_token_21|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128030": {
+ "content": "<|reserved_special_token_22|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128031": {
+ "content": "<|reserved_special_token_23|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128032": {
+ "content": "<|reserved_special_token_24|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128033": {
+ "content": "<|reserved_special_token_25|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128034": {
+ "content": "<|reserved_special_token_26|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128035": {
+ "content": "<|reserved_special_token_27|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128036": {
+ "content": "<|reserved_special_token_28|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128037": {
+ "content": "<|reserved_special_token_29|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128038": {
+ "content": "<|reserved_special_token_30|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128039": {
+ "content": "<|reserved_special_token_31|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128040": {
+ "content": "<|reserved_special_token_32|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128041": {
+ "content": "<|reserved_special_token_33|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128042": {
+ "content": "<|reserved_special_token_34|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128043": {
+ "content": "<|reserved_special_token_35|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128044": {
+ "content": "<|reserved_special_token_36|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128045": {
+ "content": "<|reserved_special_token_37|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128046": {
+ "content": "<|reserved_special_token_38|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128047": {
+ "content": "<|reserved_special_token_39|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128048": {
+ "content": "<|reserved_special_token_40|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128049": {
+ "content": "<|reserved_special_token_41|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128050": {
+ "content": "<|reserved_special_token_42|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128051": {
+ "content": "<|reserved_special_token_43|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128052": {
+ "content": "<|reserved_special_token_44|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128053": {
+ "content": "<|reserved_special_token_45|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128054": {
+ "content": "<|reserved_special_token_46|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128055": {
+ "content": "<|reserved_special_token_47|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128056": {
+ "content": "<|reserved_special_token_48|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128057": {
+ "content": "<|reserved_special_token_49|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128058": {
+ "content": "<|reserved_special_token_50|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128059": {
+ "content": "<|reserved_special_token_51|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128060": {
+ "content": "<|reserved_special_token_52|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128061": {
+ "content": "<|reserved_special_token_53|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128062": {
+ "content": "<|reserved_special_token_54|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128063": {
+ "content": "<|reserved_special_token_55|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128064": {
+ "content": "<|reserved_special_token_56|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128065": {
+ "content": "<|reserved_special_token_57|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128066": {
+ "content": "<|reserved_special_token_58|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128067": {
+ "content": "<|reserved_special_token_59|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128068": {
+ "content": "<|reserved_special_token_60|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128069": {
+ "content": "<|reserved_special_token_61|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128070": {
+ "content": "<|reserved_special_token_62|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128071": {
+ "content": "<|reserved_special_token_63|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128072": {
+ "content": "<|reserved_special_token_64|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128073": {
+ "content": "<|reserved_special_token_65|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128074": {
+ "content": "<|reserved_special_token_66|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128075": {
+ "content": "<|reserved_special_token_67|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128076": {
+ "content": "<|reserved_special_token_68|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128077": {
+ "content": "<|reserved_special_token_69|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128078": {
+ "content": "<|reserved_special_token_70|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128079": {
+ "content": "<|reserved_special_token_71|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128080": {
+ "content": "<|reserved_special_token_72|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128081": {
+ "content": "<|reserved_special_token_73|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128082": {
+ "content": "<|reserved_special_token_74|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128083": {
+ "content": "<|reserved_special_token_75|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128084": {
+ "content": "<|reserved_special_token_76|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128085": {
+ "content": "<|reserved_special_token_77|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128086": {
+ "content": "<|reserved_special_token_78|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128087": {
+ "content": "<|reserved_special_token_79|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128088": {
+ "content": "<|reserved_special_token_80|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128089": {
+ "content": "<|reserved_special_token_81|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128090": {
+ "content": "<|reserved_special_token_82|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128091": {
+ "content": "<|reserved_special_token_83|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128092": {
+ "content": "<|reserved_special_token_84|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128093": {
+ "content": "<|reserved_special_token_85|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128094": {
+ "content": "<|reserved_special_token_86|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128095": {
+ "content": "<|reserved_special_token_87|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128096": {
+ "content": "<|reserved_special_token_88|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128097": {
+ "content": "<|reserved_special_token_89|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128098": {
+ "content": "<|reserved_special_token_90|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128099": {
+ "content": "<|reserved_special_token_91|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128100": {
+ "content": "<|reserved_special_token_92|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128101": {
+ "content": "<|reserved_special_token_93|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128102": {
+ "content": "<|reserved_special_token_94|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128103": {
+ "content": "<|reserved_special_token_95|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128104": {
+ "content": "<|reserved_special_token_96|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128105": {
+ "content": "<|reserved_special_token_97|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128106": {
+ "content": "<|reserved_special_token_98|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128107": {
+ "content": "<|reserved_special_token_99|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128108": {
+ "content": "<|reserved_special_token_100|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128109": {
+ "content": "<|reserved_special_token_101|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128110": {
+ "content": "<|reserved_special_token_102|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128111": {
+ "content": "<|reserved_special_token_103|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128112": {
+ "content": "<|reserved_special_token_104|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128113": {
+ "content": "<|reserved_special_token_105|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128114": {
+ "content": "<|reserved_special_token_106|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128115": {
+ "content": "<|reserved_special_token_107|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128116": {
+ "content": "<|reserved_special_token_108|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128117": {
+ "content": "<|reserved_special_token_109|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128118": {
+ "content": "<|reserved_special_token_110|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128119": {
+ "content": "<|reserved_special_token_111|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128120": {
+ "content": "<|reserved_special_token_112|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128121": {
+ "content": "<|reserved_special_token_113|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128122": {
+ "content": "<|reserved_special_token_114|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128123": {
+ "content": "<|reserved_special_token_115|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128124": {
+ "content": "<|reserved_special_token_116|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128125": {
+ "content": "<|reserved_special_token_117|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128126": {
+ "content": "<|reserved_special_token_118|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128127": {
+ "content": "<|reserved_special_token_119|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128128": {
+ "content": "<|reserved_special_token_120|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128129": {
+ "content": "<|reserved_special_token_121|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128130": {
+ "content": "<|reserved_special_token_122|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128131": {
+ "content": "<|reserved_special_token_123|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128132": {
+ "content": "<|reserved_special_token_124|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128133": {
+ "content": "<|reserved_special_token_125|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128134": {
+ "content": "<|reserved_special_token_126|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128135": {
+ "content": "<|reserved_special_token_127|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128136": {
+ "content": "<|reserved_special_token_128|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128137": {
+ "content": "<|reserved_special_token_129|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128138": {
+ "content": "<|reserved_special_token_130|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128139": {
+ "content": "<|reserved_special_token_131|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128140": {
+ "content": "<|reserved_special_token_132|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128141": {
+ "content": "<|reserved_special_token_133|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128142": {
+ "content": "<|reserved_special_token_134|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128143": {
+ "content": "<|reserved_special_token_135|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128144": {
+ "content": "<|reserved_special_token_136|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128145": {
+ "content": "<|reserved_special_token_137|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128146": {
+ "content": "<|reserved_special_token_138|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128147": {
+ "content": "<|reserved_special_token_139|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128148": {
+ "content": "<|reserved_special_token_140|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128149": {
+ "content": "<|reserved_special_token_141|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128150": {
+ "content": "<|reserved_special_token_142|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128151": {
+ "content": "<|reserved_special_token_143|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128152": {
+ "content": "<|reserved_special_token_144|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128153": {
+ "content": "<|reserved_special_token_145|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128154": {
+ "content": "<|reserved_special_token_146|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128155": {
+ "content": "<|reserved_special_token_147|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128156": {
+ "content": "<|reserved_special_token_148|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128157": {
+ "content": "<|reserved_special_token_149|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128158": {
+ "content": "<|reserved_special_token_150|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128159": {
+ "content": "<|reserved_special_token_151|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128160": {
+ "content": "<|reserved_special_token_152|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128161": {
+ "content": "<|reserved_special_token_153|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128162": {
+ "content": "<|reserved_special_token_154|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128163": {
+ "content": "<|reserved_special_token_155|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128164": {
+ "content": "<|reserved_special_token_156|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128165": {
+ "content": "<|reserved_special_token_157|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128166": {
+ "content": "<|reserved_special_token_158|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128167": {
+ "content": "<|reserved_special_token_159|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128168": {
+ "content": "<|reserved_special_token_160|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128169": {
+ "content": "<|reserved_special_token_161|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128170": {
+ "content": "<|reserved_special_token_162|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128171": {
+ "content": "<|reserved_special_token_163|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128172": {
+ "content": "<|reserved_special_token_164|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128173": {
+ "content": "<|reserved_special_token_165|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128174": {
+ "content": "<|reserved_special_token_166|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128175": {
+ "content": "<|reserved_special_token_167|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128176": {
+ "content": "<|reserved_special_token_168|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128177": {
+ "content": "<|reserved_special_token_169|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128178": {
+ "content": "<|reserved_special_token_170|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128179": {
+ "content": "<|reserved_special_token_171|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128180": {
+ "content": "<|reserved_special_token_172|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128181": {
+ "content": "<|reserved_special_token_173|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128182": {
+ "content": "<|reserved_special_token_174|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128183": {
+ "content": "<|reserved_special_token_175|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128184": {
+ "content": "<|reserved_special_token_176|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128185": {
+ "content": "<|reserved_special_token_177|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128186": {
+ "content": "<|reserved_special_token_178|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128187": {
+ "content": "<|reserved_special_token_179|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128188": {
+ "content": "<|reserved_special_token_180|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128189": {
+ "content": "<|reserved_special_token_181|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128190": {
+ "content": "<|reserved_special_token_182|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128191": {
+ "content": "<|reserved_special_token_183|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128192": {
+ "content": "<|reserved_special_token_184|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128193": {
+ "content": "<|reserved_special_token_185|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128194": {
+ "content": "<|reserved_special_token_186|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128195": {
+ "content": "<|reserved_special_token_187|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128196": {
+ "content": "<|reserved_special_token_188|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128197": {
+ "content": "<|reserved_special_token_189|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128198": {
+ "content": "<|reserved_special_token_190|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128199": {
+ "content": "<|reserved_special_token_191|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128200": {
+ "content": "<|reserved_special_token_192|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128201": {
+ "content": "<|reserved_special_token_193|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128202": {
+ "content": "<|reserved_special_token_194|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128203": {
+ "content": "<|reserved_special_token_195|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128204": {
+ "content": "<|reserved_special_token_196|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128205": {
+ "content": "<|reserved_special_token_197|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128206": {
+ "content": "<|reserved_special_token_198|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128207": {
+ "content": "<|reserved_special_token_199|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128208": {
+ "content": "<|reserved_special_token_200|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128209": {
+ "content": "<|reserved_special_token_201|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128210": {
+ "content": "<|reserved_special_token_202|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128211": {
+ "content": "<|reserved_special_token_203|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128212": {
+ "content": "<|reserved_special_token_204|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128213": {
+ "content": "<|reserved_special_token_205|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128214": {
+ "content": "<|reserved_special_token_206|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128215": {
+ "content": "<|reserved_special_token_207|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128216": {
+ "content": "<|reserved_special_token_208|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128217": {
+ "content": "<|reserved_special_token_209|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128218": {
+ "content": "<|reserved_special_token_210|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128219": {
+ "content": "<|reserved_special_token_211|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128220": {
+ "content": "<|reserved_special_token_212|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128221": {
+ "content": "<|reserved_special_token_213|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128222": {
+ "content": "<|reserved_special_token_214|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128223": {
+ "content": "<|reserved_special_token_215|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128224": {
+ "content": "<|reserved_special_token_216|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128225": {
+ "content": "<|reserved_special_token_217|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128226": {
+ "content": "<|reserved_special_token_218|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128227": {
+ "content": "<|reserved_special_token_219|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128228": {
+ "content": "<|reserved_special_token_220|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128229": {
+ "content": "<|reserved_special_token_221|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128230": {
+ "content": "<|reserved_special_token_222|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128231": {
+ "content": "<|reserved_special_token_223|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128232": {
+ "content": "<|reserved_special_token_224|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128233": {
+ "content": "<|reserved_special_token_225|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128234": {
+ "content": "<|reserved_special_token_226|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128235": {
+ "content": "<|reserved_special_token_227|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128236": {
+ "content": "<|reserved_special_token_228|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128237": {
+ "content": "<|reserved_special_token_229|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128238": {
+ "content": "<|reserved_special_token_230|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128239": {
+ "content": "<|reserved_special_token_231|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128240": {
+ "content": "<|reserved_special_token_232|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128241": {
+ "content": "<|reserved_special_token_233|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128242": {
+ "content": "<|reserved_special_token_234|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128243": {
+ "content": "<|reserved_special_token_235|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128244": {
+ "content": "<|reserved_special_token_236|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128245": {
+ "content": "<|reserved_special_token_237|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128246": {
+ "content": "<|reserved_special_token_238|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128247": {
+ "content": "<|reserved_special_token_239|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128248": {
+ "content": "<|reserved_special_token_240|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128249": {
+ "content": "<|reserved_special_token_241|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128250": {
+ "content": "<|reserved_special_token_242|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128251": {
+ "content": "<|reserved_special_token_243|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128252": {
+ "content": "<|reserved_special_token_244|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128253": {
+ "content": "<|reserved_special_token_245|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128254": {
+ "content": "<|reserved_special_token_246|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128255": {
+ "content": "<|reserved_special_token_247|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<|begin_of_text|>",
+ "chat_template": "{{- bos_token }}\n{%- if custom_tools is defined %}\n {%- set tools = custom_tools %}\n{%- endif %}\n{%- if not tools_in_user_message is defined %}\n {%- set tools_in_user_message = true %}\n{%- endif %}\n{%- if not date_string is defined %}\n {%- set date_string = \"26 Jul 2024\" %}\n{%- endif %}\n{%- if not tools is defined %}\n {%- set tools = none %}\n{%- endif %}\n\n{#- This block extracts the system message, so we can slot it into the right place. #}\n{%- if messages[0]['role'] == 'system' %}\n {%- set system_message = messages[0]['content']|trim %}\n {%- set messages = messages[1:] %}\n{%- else %}\n {%- set system_message = \"\" %}\n{%- endif %}\n\n{#- System message + builtin tools #}\n{{- \"<|start_header_id|>system<|end_header_id|>\\n\\n\" }}\n{%- if builtin_tools is defined or tools is not none %}\n {{- \"Environment: ipython\\n\" }}\n{%- endif %}\n{%- if builtin_tools is defined %}\n {{- \"Tools: \" + builtin_tools | reject('equalto', 'code_interpreter') | join(\", \") + \"\\n\\n\"}}\n{%- endif %}\n{{- \"Cutting Knowledge Date: December 2023\\n\" }}\n{{- \"Today Date: \" + date_string + \"\\n\\n\" }}\n{%- if tools is not none and not tools_in_user_message %}\n {{- \"You have access to the following functions. To call a function, please respond with JSON for a function call.\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' }}\n {{- \"Do not use variables.\\n\\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\\n\\n\" }}\n {%- endfor %}\n{%- endif %}\n{{- system_message }}\n{{- \"<|eot_id|>\" }}\n\n{#- Custom tools are passed in a user message with some extra guidance #}\n{%- if tools_in_user_message and not tools is none %}\n {#- Extract the first user message so we can plug it in here #}\n {%- if messages | length != 0 %}\n {%- set first_user_message = messages[0]['content']|trim %}\n {%- set messages = messages[1:] %}\n {%- else %}\n {{- raise_exception(\"Cannot put tools in the first user message when there's no first user message!\") }}\n{%- endif %}\n {{- '<|start_header_id|>user<|end_header_id|>\\n\\n' -}}\n {{- \"Given the following functions, please respond with a JSON for a function call \" }}\n {{- \"with its proper arguments that best answers the given prompt.\\n\\n\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' }}\n {{- \"Do not use variables.\\n\\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\\n\\n\" }}\n {%- endfor %}\n {{- first_user_message + \"<|eot_id|>\"}}\n{%- endif %}\n\n{%- for message in messages %}\n {%- if not (message.role == 'ipython' or message.role == 'tool' or 'tool_calls' in message) %}\n {{- '<|start_header_id|>' + message['role'] + '<|end_header_id|>\\n\\n'+ message['content'] | trim + '<|eot_id|>' }}\n {%- elif 'tool_calls' in message %}\n {%- if not message.tool_calls|length == 1 %}\n {{- raise_exception(\"This model only supports single tool-calls at once!\") }}\n {%- endif %}\n {%- set tool_call = message.tool_calls[0].function %}\n {%- if builtin_tools is defined and tool_call.name in builtin_tools %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' -}}\n {{- \"<|python_tag|>\" + tool_call.name + \".call(\" }}\n {%- for arg_name, arg_val in tool_call.arguments | items %}\n {{- arg_name + '=\"' + arg_val + '\"' }}\n {%- if not loop.last %}\n {{- \", \" }}\n {%- endif %}\n {%- endfor %}\n {{- \")\" }}\n {%- else %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' -}}\n {{- '{\"name\": \"' + tool_call.name + '\", ' }}\n {{- '\"parameters\": ' }}\n {{- tool_call.arguments | tojson }}\n {{- \"}\" }}\n {%- endif %}\n {%- if builtin_tools is defined %}\n {#- This means we're in ipython mode #}\n {{- \"<|eom_id|>\" }}\n {%- else %}\n {{- \"<|eot_id|>\" }}\n {%- endif %}\n {%- elif message.role == \"tool\" or message.role == \"ipython\" %}\n {{- \"<|start_header_id|>ipython<|end_header_id|>\\n\\n\" }}\n {%- if message.content is mapping or message.content is iterable %}\n {{- message.content | tojson }}\n {%- else %}\n {{- message.content }}\n {%- endif %}\n {{- \"<|eot_id|>\" }}\n {%- endif %}\n{%- endfor %}\n{%- if add_generation_prompt %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\\n\\n' }}\n{%- endif %}\n",
+ "clean_up_tokenization_spaces": true,
+ "eos_token": "<|eot_id|>",
+ "extra_special_tokens": {},
+ "model_input_names": [
+ "input_ids",
+ "attention_mask"
+ ],
+ "model_max_length": 131072,
+ "pad_token": "<|end_of_text|>",
+ "tokenizer_class": "PreTrainedTokenizer"
+}
diff --git a/checkpoint-510/trainer_state.json b/checkpoint-510/trainer_state.json
new file mode 100644
index 0000000000000000000000000000000000000000..abfea951227463cde7db87e94aff1638159567f6
--- /dev/null
+++ b/checkpoint-510/trainer_state.json
@@ -0,0 +1,3603 @@
+{
+ "best_metric": null,
+ "best_model_checkpoint": null,
+ "epoch": 5.9375,
+ "eval_steps": 500,
+ "global_step": 510,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.01171875,
+ "grad_norm": 36.23282241821289,
+ "learning_rate": 5.0000000000000004e-08,
+ "loss": 2.3839,
+ "step": 1
+ },
+ {
+ "epoch": 0.0234375,
+ "grad_norm": 35.918636322021484,
+ "learning_rate": 1.0000000000000001e-07,
+ "loss": 2.3798,
+ "step": 2
+ },
+ {
+ "epoch": 0.03515625,
+ "grad_norm": 35.62618637084961,
+ "learning_rate": 1.5000000000000002e-07,
+ "loss": 2.386,
+ "step": 3
+ },
+ {
+ "epoch": 0.046875,
+ "grad_norm": 35.966087341308594,
+ "learning_rate": 2.0000000000000002e-07,
+ "loss": 2.3803,
+ "step": 4
+ },
+ {
+ "epoch": 0.05859375,
+ "grad_norm": 35.38177490234375,
+ "learning_rate": 2.5000000000000004e-07,
+ "loss": 2.3937,
+ "step": 5
+ },
+ {
+ "epoch": 0.0703125,
+ "grad_norm": 35.99677658081055,
+ "learning_rate": 3.0000000000000004e-07,
+ "loss": 2.3906,
+ "step": 6
+ },
+ {
+ "epoch": 0.08203125,
+ "grad_norm": 35.44341278076172,
+ "learning_rate": 3.5000000000000004e-07,
+ "loss": 2.3539,
+ "step": 7
+ },
+ {
+ "epoch": 0.09375,
+ "grad_norm": 35.300697326660156,
+ "learning_rate": 4.0000000000000003e-07,
+ "loss": 2.3459,
+ "step": 8
+ },
+ {
+ "epoch": 0.10546875,
+ "grad_norm": 34.092952728271484,
+ "learning_rate": 4.5000000000000003e-07,
+ "loss": 2.2959,
+ "step": 9
+ },
+ {
+ "epoch": 0.1171875,
+ "grad_norm": 34.46371841430664,
+ "learning_rate": 5.000000000000001e-07,
+ "loss": 2.2661,
+ "step": 10
+ },
+ {
+ "epoch": 0.12890625,
+ "grad_norm": 34.62260818481445,
+ "learning_rate": 5.5e-07,
+ "loss": 2.2918,
+ "step": 11
+ },
+ {
+ "epoch": 0.140625,
+ "grad_norm": 33.790374755859375,
+ "learning_rate": 6.000000000000001e-07,
+ "loss": 2.223,
+ "step": 12
+ },
+ {
+ "epoch": 0.15234375,
+ "grad_norm": 33.766536712646484,
+ "learning_rate": 6.5e-07,
+ "loss": 2.2267,
+ "step": 13
+ },
+ {
+ "epoch": 0.1640625,
+ "grad_norm": 33.894081115722656,
+ "learning_rate": 7.000000000000001e-07,
+ "loss": 2.1465,
+ "step": 14
+ },
+ {
+ "epoch": 0.17578125,
+ "grad_norm": 33.162452697753906,
+ "learning_rate": 7.5e-07,
+ "loss": 2.0495,
+ "step": 15
+ },
+ {
+ "epoch": 0.1875,
+ "grad_norm": 32.954341888427734,
+ "learning_rate": 8.000000000000001e-07,
+ "loss": 1.9627,
+ "step": 16
+ },
+ {
+ "epoch": 0.19921875,
+ "grad_norm": 33.96324157714844,
+ "learning_rate": 8.500000000000001e-07,
+ "loss": 1.8867,
+ "step": 17
+ },
+ {
+ "epoch": 0.2109375,
+ "grad_norm": 33.81139373779297,
+ "learning_rate": 9.000000000000001e-07,
+ "loss": 1.7752,
+ "step": 18
+ },
+ {
+ "epoch": 0.22265625,
+ "grad_norm": 34.87086868286133,
+ "learning_rate": 9.500000000000001e-07,
+ "loss": 1.6944,
+ "step": 19
+ },
+ {
+ "epoch": 0.234375,
+ "grad_norm": 34.84965133666992,
+ "learning_rate": 1.0000000000000002e-06,
+ "loss": 1.5707,
+ "step": 20
+ },
+ {
+ "epoch": 0.24609375,
+ "grad_norm": 35.227317810058594,
+ "learning_rate": 1.0500000000000001e-06,
+ "loss": 1.4369,
+ "step": 21
+ },
+ {
+ "epoch": 0.2578125,
+ "grad_norm": 34.91344451904297,
+ "learning_rate": 1.1e-06,
+ "loss": 1.3202,
+ "step": 22
+ },
+ {
+ "epoch": 0.26953125,
+ "grad_norm": 31.7376766204834,
+ "learning_rate": 1.1500000000000002e-06,
+ "loss": 1.1398,
+ "step": 23
+ },
+ {
+ "epoch": 0.28125,
+ "grad_norm": 30.24741554260254,
+ "learning_rate": 1.2000000000000002e-06,
+ "loss": 1.0421,
+ "step": 24
+ },
+ {
+ "epoch": 0.29296875,
+ "grad_norm": 28.292400360107422,
+ "learning_rate": 1.25e-06,
+ "loss": 0.8817,
+ "step": 25
+ },
+ {
+ "epoch": 0.3046875,
+ "grad_norm": 30.44672393798828,
+ "learning_rate": 1.3e-06,
+ "loss": 0.7073,
+ "step": 26
+ },
+ {
+ "epoch": 0.31640625,
+ "grad_norm": 29.416427612304688,
+ "learning_rate": 1.3500000000000002e-06,
+ "loss": 0.5444,
+ "step": 27
+ },
+ {
+ "epoch": 0.328125,
+ "grad_norm": 24.820096969604492,
+ "learning_rate": 1.4000000000000001e-06,
+ "loss": 0.4025,
+ "step": 28
+ },
+ {
+ "epoch": 0.33984375,
+ "grad_norm": 21.023277282714844,
+ "learning_rate": 1.45e-06,
+ "loss": 0.307,
+ "step": 29
+ },
+ {
+ "epoch": 0.3515625,
+ "grad_norm": 19.656967163085938,
+ "learning_rate": 1.5e-06,
+ "loss": 0.2151,
+ "step": 30
+ },
+ {
+ "epoch": 0.36328125,
+ "grad_norm": 14.91929817199707,
+ "learning_rate": 1.5500000000000002e-06,
+ "loss": 0.1448,
+ "step": 31
+ },
+ {
+ "epoch": 0.375,
+ "grad_norm": 5.083199977874756,
+ "learning_rate": 1.6000000000000001e-06,
+ "loss": 0.09,
+ "step": 32
+ },
+ {
+ "epoch": 0.38671875,
+ "grad_norm": 2.320681571960449,
+ "learning_rate": 1.6500000000000003e-06,
+ "loss": 0.0641,
+ "step": 33
+ },
+ {
+ "epoch": 0.3984375,
+ "grad_norm": 1.6233159303665161,
+ "learning_rate": 1.7000000000000002e-06,
+ "loss": 0.0584,
+ "step": 34
+ },
+ {
+ "epoch": 0.41015625,
+ "grad_norm": 1.6057201623916626,
+ "learning_rate": 1.75e-06,
+ "loss": 0.0626,
+ "step": 35
+ },
+ {
+ "epoch": 0.421875,
+ "grad_norm": 1.8360320329666138,
+ "learning_rate": 1.8000000000000001e-06,
+ "loss": 0.0563,
+ "step": 36
+ },
+ {
+ "epoch": 0.43359375,
+ "grad_norm": 1.736350178718567,
+ "learning_rate": 1.85e-06,
+ "loss": 0.0609,
+ "step": 37
+ },
+ {
+ "epoch": 0.4453125,
+ "grad_norm": 1.1473922729492188,
+ "learning_rate": 1.9000000000000002e-06,
+ "loss": 0.0541,
+ "step": 38
+ },
+ {
+ "epoch": 0.45703125,
+ "grad_norm": 1.1722168922424316,
+ "learning_rate": 1.9500000000000004e-06,
+ "loss": 0.0534,
+ "step": 39
+ },
+ {
+ "epoch": 0.46875,
+ "grad_norm": 1.356987714767456,
+ "learning_rate": 2.0000000000000003e-06,
+ "loss": 0.0496,
+ "step": 40
+ },
+ {
+ "epoch": 0.48046875,
+ "grad_norm": 0.8023216724395752,
+ "learning_rate": 2.05e-06,
+ "loss": 0.0527,
+ "step": 41
+ },
+ {
+ "epoch": 0.4921875,
+ "grad_norm": 0.9803515672683716,
+ "learning_rate": 2.1000000000000002e-06,
+ "loss": 0.0478,
+ "step": 42
+ },
+ {
+ "epoch": 0.50390625,
+ "grad_norm": 0.8733468651771545,
+ "learning_rate": 2.15e-06,
+ "loss": 0.052,
+ "step": 43
+ },
+ {
+ "epoch": 0.515625,
+ "grad_norm": 0.8213743567466736,
+ "learning_rate": 2.2e-06,
+ "loss": 0.0448,
+ "step": 44
+ },
+ {
+ "epoch": 0.52734375,
+ "grad_norm": 0.843189537525177,
+ "learning_rate": 2.25e-06,
+ "loss": 0.0498,
+ "step": 45
+ },
+ {
+ "epoch": 0.5390625,
+ "grad_norm": 0.8801079392433167,
+ "learning_rate": 2.3000000000000004e-06,
+ "loss": 0.0408,
+ "step": 46
+ },
+ {
+ "epoch": 0.55078125,
+ "grad_norm": 0.7131401300430298,
+ "learning_rate": 2.35e-06,
+ "loss": 0.0405,
+ "step": 47
+ },
+ {
+ "epoch": 0.5625,
+ "grad_norm": 0.8996126651763916,
+ "learning_rate": 2.4000000000000003e-06,
+ "loss": 0.0525,
+ "step": 48
+ },
+ {
+ "epoch": 0.57421875,
+ "grad_norm": 0.8606986403465271,
+ "learning_rate": 2.4500000000000003e-06,
+ "loss": 0.0438,
+ "step": 49
+ },
+ {
+ "epoch": 0.5859375,
+ "grad_norm": 0.6918051838874817,
+ "learning_rate": 2.5e-06,
+ "loss": 0.0394,
+ "step": 50
+ },
+ {
+ "epoch": 0.59765625,
+ "grad_norm": 0.6177802085876465,
+ "learning_rate": 2.55e-06,
+ "loss": 0.0387,
+ "step": 51
+ },
+ {
+ "epoch": 0.609375,
+ "grad_norm": 0.7042555809020996,
+ "learning_rate": 2.6e-06,
+ "loss": 0.0434,
+ "step": 52
+ },
+ {
+ "epoch": 0.62109375,
+ "grad_norm": 0.6537717580795288,
+ "learning_rate": 2.6500000000000005e-06,
+ "loss": 0.0396,
+ "step": 53
+ },
+ {
+ "epoch": 0.6328125,
+ "grad_norm": 0.7834082841873169,
+ "learning_rate": 2.7000000000000004e-06,
+ "loss": 0.0411,
+ "step": 54
+ },
+ {
+ "epoch": 0.64453125,
+ "grad_norm": 0.7287272810935974,
+ "learning_rate": 2.7500000000000004e-06,
+ "loss": 0.0408,
+ "step": 55
+ },
+ {
+ "epoch": 0.65625,
+ "grad_norm": 0.7186263203620911,
+ "learning_rate": 2.8000000000000003e-06,
+ "loss": 0.0394,
+ "step": 56
+ },
+ {
+ "epoch": 0.66796875,
+ "grad_norm": 0.7264899611473083,
+ "learning_rate": 2.85e-06,
+ "loss": 0.0427,
+ "step": 57
+ },
+ {
+ "epoch": 0.6796875,
+ "grad_norm": 0.7665618062019348,
+ "learning_rate": 2.9e-06,
+ "loss": 0.0368,
+ "step": 58
+ },
+ {
+ "epoch": 0.69140625,
+ "grad_norm": 0.7222962379455566,
+ "learning_rate": 2.95e-06,
+ "loss": 0.0412,
+ "step": 59
+ },
+ {
+ "epoch": 0.703125,
+ "grad_norm": 0.7061101794242859,
+ "learning_rate": 3e-06,
+ "loss": 0.0377,
+ "step": 60
+ },
+ {
+ "epoch": 0.71484375,
+ "grad_norm": 0.5724324584007263,
+ "learning_rate": 3.05e-06,
+ "loss": 0.0387,
+ "step": 61
+ },
+ {
+ "epoch": 0.7265625,
+ "grad_norm": 0.5535506010055542,
+ "learning_rate": 3.1000000000000004e-06,
+ "loss": 0.0403,
+ "step": 62
+ },
+ {
+ "epoch": 0.73828125,
+ "grad_norm": 0.6553678512573242,
+ "learning_rate": 3.1500000000000003e-06,
+ "loss": 0.0415,
+ "step": 63
+ },
+ {
+ "epoch": 0.75,
+ "grad_norm": 0.6137285828590393,
+ "learning_rate": 3.2000000000000003e-06,
+ "loss": 0.0383,
+ "step": 64
+ },
+ {
+ "epoch": 0.76171875,
+ "grad_norm": 0.5985754132270813,
+ "learning_rate": 3.2500000000000002e-06,
+ "loss": 0.0355,
+ "step": 65
+ },
+ {
+ "epoch": 0.7734375,
+ "grad_norm": 0.5903909802436829,
+ "learning_rate": 3.3000000000000006e-06,
+ "loss": 0.0374,
+ "step": 66
+ },
+ {
+ "epoch": 0.78515625,
+ "grad_norm": 0.5718765258789062,
+ "learning_rate": 3.3500000000000005e-06,
+ "loss": 0.0339,
+ "step": 67
+ },
+ {
+ "epoch": 0.796875,
+ "grad_norm": 0.6844965815544128,
+ "learning_rate": 3.4000000000000005e-06,
+ "loss": 0.0405,
+ "step": 68
+ },
+ {
+ "epoch": 0.80859375,
+ "grad_norm": 0.5959618091583252,
+ "learning_rate": 3.45e-06,
+ "loss": 0.0338,
+ "step": 69
+ },
+ {
+ "epoch": 0.8203125,
+ "grad_norm": 0.6095123291015625,
+ "learning_rate": 3.5e-06,
+ "loss": 0.0362,
+ "step": 70
+ },
+ {
+ "epoch": 0.83203125,
+ "grad_norm": 0.543708086013794,
+ "learning_rate": 3.5500000000000003e-06,
+ "loss": 0.0355,
+ "step": 71
+ },
+ {
+ "epoch": 0.84375,
+ "grad_norm": 0.6969983577728271,
+ "learning_rate": 3.6000000000000003e-06,
+ "loss": 0.0325,
+ "step": 72
+ },
+ {
+ "epoch": 0.85546875,
+ "grad_norm": 0.6022969484329224,
+ "learning_rate": 3.65e-06,
+ "loss": 0.0342,
+ "step": 73
+ },
+ {
+ "epoch": 0.8671875,
+ "grad_norm": 0.6262147426605225,
+ "learning_rate": 3.7e-06,
+ "loss": 0.0348,
+ "step": 74
+ },
+ {
+ "epoch": 0.87890625,
+ "grad_norm": 0.5729933381080627,
+ "learning_rate": 3.7500000000000005e-06,
+ "loss": 0.0318,
+ "step": 75
+ },
+ {
+ "epoch": 0.890625,
+ "grad_norm": 0.5846775770187378,
+ "learning_rate": 3.8000000000000005e-06,
+ "loss": 0.0309,
+ "step": 76
+ },
+ {
+ "epoch": 0.90234375,
+ "grad_norm": 0.6469219923019409,
+ "learning_rate": 3.85e-06,
+ "loss": 0.0324,
+ "step": 77
+ },
+ {
+ "epoch": 0.9140625,
+ "grad_norm": 0.6574859023094177,
+ "learning_rate": 3.900000000000001e-06,
+ "loss": 0.0325,
+ "step": 78
+ },
+ {
+ "epoch": 0.92578125,
+ "grad_norm": 0.5833832025527954,
+ "learning_rate": 3.95e-06,
+ "loss": 0.0232,
+ "step": 79
+ },
+ {
+ "epoch": 0.9375,
+ "grad_norm": 0.7503570318222046,
+ "learning_rate": 4.000000000000001e-06,
+ "loss": 0.0267,
+ "step": 80
+ },
+ {
+ "epoch": 0.94921875,
+ "grad_norm": 0.7181633114814758,
+ "learning_rate": 4.05e-06,
+ "loss": 0.0304,
+ "step": 81
+ },
+ {
+ "epoch": 0.9609375,
+ "grad_norm": 0.6477274298667908,
+ "learning_rate": 4.1e-06,
+ "loss": 0.0297,
+ "step": 82
+ },
+ {
+ "epoch": 0.97265625,
+ "grad_norm": 0.6768563389778137,
+ "learning_rate": 4.15e-06,
+ "loss": 0.0279,
+ "step": 83
+ },
+ {
+ "epoch": 0.984375,
+ "grad_norm": 0.7905837297439575,
+ "learning_rate": 4.2000000000000004e-06,
+ "loss": 0.0301,
+ "step": 84
+ },
+ {
+ "epoch": 0.99609375,
+ "grad_norm": 0.5576608777046204,
+ "learning_rate": 4.25e-06,
+ "loss": 0.0322,
+ "step": 85
+ },
+ {
+ "epoch": 1.0,
+ "grad_norm": 0.5576608777046204,
+ "learning_rate": 4.3e-06,
+ "loss": 0.0226,
+ "step": 86
+ },
+ {
+ "epoch": 1.01171875,
+ "grad_norm": 1.0774812698364258,
+ "learning_rate": 4.350000000000001e-06,
+ "loss": 0.0215,
+ "step": 87
+ },
+ {
+ "epoch": 1.0234375,
+ "grad_norm": 0.47373324632644653,
+ "learning_rate": 4.4e-06,
+ "loss": 0.0235,
+ "step": 88
+ },
+ {
+ "epoch": 1.03515625,
+ "grad_norm": 0.7665970325469971,
+ "learning_rate": 4.450000000000001e-06,
+ "loss": 0.0242,
+ "step": 89
+ },
+ {
+ "epoch": 1.046875,
+ "grad_norm": 0.6290147304534912,
+ "learning_rate": 4.5e-06,
+ "loss": 0.0209,
+ "step": 90
+ },
+ {
+ "epoch": 1.05859375,
+ "grad_norm": 0.5703024864196777,
+ "learning_rate": 4.5500000000000005e-06,
+ "loss": 0.0192,
+ "step": 91
+ },
+ {
+ "epoch": 1.0703125,
+ "grad_norm": 0.6099259853363037,
+ "learning_rate": 4.600000000000001e-06,
+ "loss": 0.0181,
+ "step": 92
+ },
+ {
+ "epoch": 1.08203125,
+ "grad_norm": 0.6570988297462463,
+ "learning_rate": 4.65e-06,
+ "loss": 0.0201,
+ "step": 93
+ },
+ {
+ "epoch": 1.09375,
+ "grad_norm": 0.7848325371742249,
+ "learning_rate": 4.7e-06,
+ "loss": 0.0253,
+ "step": 94
+ },
+ {
+ "epoch": 1.10546875,
+ "grad_norm": 0.6759209036827087,
+ "learning_rate": 4.75e-06,
+ "loss": 0.0195,
+ "step": 95
+ },
+ {
+ "epoch": 1.1171875,
+ "grad_norm": 0.4861151874065399,
+ "learning_rate": 4.800000000000001e-06,
+ "loss": 0.0191,
+ "step": 96
+ },
+ {
+ "epoch": 1.12890625,
+ "grad_norm": 0.6268576383590698,
+ "learning_rate": 4.85e-06,
+ "loss": 0.0211,
+ "step": 97
+ },
+ {
+ "epoch": 1.140625,
+ "grad_norm": 0.5862017869949341,
+ "learning_rate": 4.9000000000000005e-06,
+ "loss": 0.0177,
+ "step": 98
+ },
+ {
+ "epoch": 1.15234375,
+ "grad_norm": 0.4569724202156067,
+ "learning_rate": 4.95e-06,
+ "loss": 0.0164,
+ "step": 99
+ },
+ {
+ "epoch": 1.1640625,
+ "grad_norm": 0.4539048969745636,
+ "learning_rate": 5e-06,
+ "loss": 0.0152,
+ "step": 100
+ },
+ {
+ "epoch": 1.17578125,
+ "grad_norm": 0.4553528428077698,
+ "learning_rate": 4.999926609487568e-06,
+ "loss": 0.0208,
+ "step": 101
+ },
+ {
+ "epoch": 1.1875,
+ "grad_norm": 0.5182592272758484,
+ "learning_rate": 4.999706442259205e-06,
+ "loss": 0.0154,
+ "step": 102
+ },
+ {
+ "epoch": 1.19921875,
+ "grad_norm": 0.5602673888206482,
+ "learning_rate": 4.999339511241458e-06,
+ "loss": 0.0196,
+ "step": 103
+ },
+ {
+ "epoch": 1.2109375,
+ "grad_norm": 0.7579494118690491,
+ "learning_rate": 4.9988258379777334e-06,
+ "loss": 0.0198,
+ "step": 104
+ },
+ {
+ "epoch": 1.22265625,
+ "grad_norm": 0.603757381439209,
+ "learning_rate": 4.998165452627025e-06,
+ "loss": 0.0185,
+ "step": 105
+ },
+ {
+ "epoch": 1.234375,
+ "grad_norm": 0.5520291924476624,
+ "learning_rate": 4.99735839396215e-06,
+ "loss": 0.018,
+ "step": 106
+ },
+ {
+ "epoch": 1.24609375,
+ "grad_norm": 0.55808424949646,
+ "learning_rate": 4.996404709367466e-06,
+ "loss": 0.0159,
+ "step": 107
+ },
+ {
+ "epoch": 1.2578125,
+ "grad_norm": 0.47174298763275146,
+ "learning_rate": 4.995304454836095e-06,
+ "loss": 0.0122,
+ "step": 108
+ },
+ {
+ "epoch": 1.26953125,
+ "grad_norm": 0.5289337038993835,
+ "learning_rate": 4.994057694966632e-06,
+ "loss": 0.0168,
+ "step": 109
+ },
+ {
+ "epoch": 1.28125,
+ "grad_norm": 0.5390430092811584,
+ "learning_rate": 4.992664502959351e-06,
+ "loss": 0.017,
+ "step": 110
+ },
+ {
+ "epoch": 1.29296875,
+ "grad_norm": 0.4966451823711395,
+ "learning_rate": 4.991124960611916e-06,
+ "loss": 0.0145,
+ "step": 111
+ },
+ {
+ "epoch": 1.3046875,
+ "grad_norm": 0.6148604154586792,
+ "learning_rate": 4.989439158314566e-06,
+ "loss": 0.0139,
+ "step": 112
+ },
+ {
+ "epoch": 1.31640625,
+ "grad_norm": 0.6303534507751465,
+ "learning_rate": 4.9876071950448185e-06,
+ "loss": 0.0118,
+ "step": 113
+ },
+ {
+ "epoch": 1.328125,
+ "grad_norm": 0.5410207509994507,
+ "learning_rate": 4.98562917836165e-06,
+ "loss": 0.0094,
+ "step": 114
+ },
+ {
+ "epoch": 1.33984375,
+ "grad_norm": 0.5350080132484436,
+ "learning_rate": 4.983505224399188e-06,
+ "loss": 0.0158,
+ "step": 115
+ },
+ {
+ "epoch": 1.3515625,
+ "grad_norm": 1.017317295074463,
+ "learning_rate": 4.9812354578598876e-06,
+ "loss": 0.0201,
+ "step": 116
+ },
+ {
+ "epoch": 1.36328125,
+ "grad_norm": 0.6891007423400879,
+ "learning_rate": 4.978820012007213e-06,
+ "loss": 0.0127,
+ "step": 117
+ },
+ {
+ "epoch": 1.375,
+ "grad_norm": 0.4756389260292053,
+ "learning_rate": 4.976259028657812e-06,
+ "loss": 0.0188,
+ "step": 118
+ },
+ {
+ "epoch": 1.38671875,
+ "grad_norm": 0.5957350730895996,
+ "learning_rate": 4.973552658173186e-06,
+ "loss": 0.011,
+ "step": 119
+ },
+ {
+ "epoch": 1.3984375,
+ "grad_norm": 0.5012223720550537,
+ "learning_rate": 4.970701059450872e-06,
+ "loss": 0.0138,
+ "step": 120
+ },
+ {
+ "epoch": 1.41015625,
+ "grad_norm": 0.4408419132232666,
+ "learning_rate": 4.9677043999151e-06,
+ "loss": 0.0144,
+ "step": 121
+ },
+ {
+ "epoch": 1.421875,
+ "grad_norm": 0.5721736550331116,
+ "learning_rate": 4.964562855506976e-06,
+ "loss": 0.0135,
+ "step": 122
+ },
+ {
+ "epoch": 1.43359375,
+ "grad_norm": 0.5479208827018738,
+ "learning_rate": 4.961276610674141e-06,
+ "loss": 0.0128,
+ "step": 123
+ },
+ {
+ "epoch": 1.4453125,
+ "grad_norm": 1.0117675065994263,
+ "learning_rate": 4.9578458583599495e-06,
+ "loss": 0.0111,
+ "step": 124
+ },
+ {
+ "epoch": 1.45703125,
+ "grad_norm": 0.5504026412963867,
+ "learning_rate": 4.954270799992138e-06,
+ "loss": 0.0083,
+ "step": 125
+ },
+ {
+ "epoch": 1.46875,
+ "grad_norm": 0.48403099179267883,
+ "learning_rate": 4.950551645470998e-06,
+ "loss": 0.0083,
+ "step": 126
+ },
+ {
+ "epoch": 1.48046875,
+ "grad_norm": 0.6866800785064697,
+ "learning_rate": 4.9466886131570565e-06,
+ "loss": 0.0085,
+ "step": 127
+ },
+ {
+ "epoch": 1.4921875,
+ "grad_norm": 0.872557520866394,
+ "learning_rate": 4.942681929858249e-06,
+ "loss": 0.0102,
+ "step": 128
+ },
+ {
+ "epoch": 1.50390625,
+ "grad_norm": 0.6924716234207153,
+ "learning_rate": 4.9385318308166065e-06,
+ "loss": 0.012,
+ "step": 129
+ },
+ {
+ "epoch": 1.515625,
+ "grad_norm": 0.5060118436813354,
+ "learning_rate": 4.934238559694448e-06,
+ "loss": 0.0084,
+ "step": 130
+ },
+ {
+ "epoch": 1.52734375,
+ "grad_norm": 0.6256171464920044,
+ "learning_rate": 4.929802368560066e-06,
+ "loss": 0.0081,
+ "step": 131
+ },
+ {
+ "epoch": 1.5390625,
+ "grad_norm": 0.5422537922859192,
+ "learning_rate": 4.925223517872934e-06,
+ "loss": 0.0077,
+ "step": 132
+ },
+ {
+ "epoch": 1.55078125,
+ "grad_norm": 0.953416109085083,
+ "learning_rate": 4.920502276468408e-06,
+ "loss": 0.0078,
+ "step": 133
+ },
+ {
+ "epoch": 1.5625,
+ "grad_norm": 0.4540804624557495,
+ "learning_rate": 4.915638921541952e-06,
+ "loss": 0.0097,
+ "step": 134
+ },
+ {
+ "epoch": 1.57421875,
+ "grad_norm": 0.3773641884326935,
+ "learning_rate": 4.9106337386328524e-06,
+ "loss": 0.0098,
+ "step": 135
+ },
+ {
+ "epoch": 1.5859375,
+ "grad_norm": 0.7970175743103027,
+ "learning_rate": 4.905487021607462e-06,
+ "loss": 0.0056,
+ "step": 136
+ },
+ {
+ "epoch": 1.59765625,
+ "grad_norm": 0.45197635889053345,
+ "learning_rate": 4.900199072641937e-06,
+ "loss": 0.0078,
+ "step": 137
+ },
+ {
+ "epoch": 1.609375,
+ "grad_norm": 0.38231438398361206,
+ "learning_rate": 4.894770202204509e-06,
+ "loss": 0.0072,
+ "step": 138
+ },
+ {
+ "epoch": 1.62109375,
+ "grad_norm": 0.2945426404476166,
+ "learning_rate": 4.889200729037241e-06,
+ "loss": 0.0086,
+ "step": 139
+ },
+ {
+ "epoch": 1.6328125,
+ "grad_norm": 0.49699363112449646,
+ "learning_rate": 4.883490980137327e-06,
+ "loss": 0.0073,
+ "step": 140
+ },
+ {
+ "epoch": 1.64453125,
+ "grad_norm": 0.38112956285476685,
+ "learning_rate": 4.8776412907378845e-06,
+ "loss": 0.0056,
+ "step": 141
+ },
+ {
+ "epoch": 1.65625,
+ "grad_norm": 0.46780407428741455,
+ "learning_rate": 4.871652004288275e-06,
+ "loss": 0.0078,
+ "step": 142
+ },
+ {
+ "epoch": 1.66796875,
+ "grad_norm": 0.43764325976371765,
+ "learning_rate": 4.865523472433942e-06,
+ "loss": 0.005,
+ "step": 143
+ },
+ {
+ "epoch": 1.6796875,
+ "grad_norm": 0.3445664644241333,
+ "learning_rate": 4.859256054995758e-06,
+ "loss": 0.0069,
+ "step": 144
+ },
+ {
+ "epoch": 1.69140625,
+ "grad_norm": 0.40410447120666504,
+ "learning_rate": 4.8528501199489045e-06,
+ "loss": 0.0088,
+ "step": 145
+ },
+ {
+ "epoch": 1.703125,
+ "grad_norm": 0.5876736640930176,
+ "learning_rate": 4.846306043401268e-06,
+ "loss": 0.0057,
+ "step": 146
+ },
+ {
+ "epoch": 1.71484375,
+ "grad_norm": 0.5149250626564026,
+ "learning_rate": 4.839624209571352e-06,
+ "loss": 0.0056,
+ "step": 147
+ },
+ {
+ "epoch": 1.7265625,
+ "grad_norm": 0.7009180784225464,
+ "learning_rate": 4.832805010765724e-06,
+ "loss": 0.0088,
+ "step": 148
+ },
+ {
+ "epoch": 1.73828125,
+ "grad_norm": 0.42258334159851074,
+ "learning_rate": 4.8258488473559794e-06,
+ "loss": 0.004,
+ "step": 149
+ },
+ {
+ "epoch": 1.75,
+ "grad_norm": 0.39231887459754944,
+ "learning_rate": 4.8187561277552376e-06,
+ "loss": 0.005,
+ "step": 150
+ },
+ {
+ "epoch": 1.76171875,
+ "grad_norm": 0.3317432701587677,
+ "learning_rate": 4.811527268394157e-06,
+ "loss": 0.0038,
+ "step": 151
+ },
+ {
+ "epoch": 1.7734375,
+ "grad_norm": 0.5022267699241638,
+ "learning_rate": 4.804162693696494e-06,
+ "loss": 0.0056,
+ "step": 152
+ },
+ {
+ "epoch": 1.78515625,
+ "grad_norm": 0.39019322395324707,
+ "learning_rate": 4.796662836054176e-06,
+ "loss": 0.0053,
+ "step": 153
+ },
+ {
+ "epoch": 1.796875,
+ "grad_norm": 0.5674042701721191,
+ "learning_rate": 4.789028135801919e-06,
+ "loss": 0.007,
+ "step": 154
+ },
+ {
+ "epoch": 1.80859375,
+ "grad_norm": 0.5690024495124817,
+ "learning_rate": 4.7812590411913755e-06,
+ "loss": 0.0053,
+ "step": 155
+ },
+ {
+ "epoch": 1.8203125,
+ "grad_norm": 0.23775412142276764,
+ "learning_rate": 4.773356008364812e-06,
+ "loss": 0.0031,
+ "step": 156
+ },
+ {
+ "epoch": 1.83203125,
+ "grad_norm": 0.4698558747768402,
+ "learning_rate": 4.765319501328332e-06,
+ "loss": 0.0021,
+ "step": 157
+ },
+ {
+ "epoch": 1.84375,
+ "grad_norm": 0.21603639423847198,
+ "learning_rate": 4.757149991924633e-06,
+ "loss": 0.0046,
+ "step": 158
+ },
+ {
+ "epoch": 1.85546875,
+ "grad_norm": 0.33830726146698,
+ "learning_rate": 4.748847959805297e-06,
+ "loss": 0.0022,
+ "step": 159
+ },
+ {
+ "epoch": 1.8671875,
+ "grad_norm": 0.44919782876968384,
+ "learning_rate": 4.740413892402639e-06,
+ "loss": 0.0032,
+ "step": 160
+ },
+ {
+ "epoch": 1.87890625,
+ "grad_norm": 0.5119614601135254,
+ "learning_rate": 4.731848284901082e-06,
+ "loss": 0.006,
+ "step": 161
+ },
+ {
+ "epoch": 1.890625,
+ "grad_norm": 0.3875437080860138,
+ "learning_rate": 4.723151640208084e-06,
+ "loss": 0.0024,
+ "step": 162
+ },
+ {
+ "epoch": 1.90234375,
+ "grad_norm": 0.3179910182952881,
+ "learning_rate": 4.714324468924614e-06,
+ "loss": 0.0037,
+ "step": 163
+ },
+ {
+ "epoch": 1.9140625,
+ "grad_norm": 0.43395644426345825,
+ "learning_rate": 4.705367289315172e-06,
+ "loss": 0.0027,
+ "step": 164
+ },
+ {
+ "epoch": 1.92578125,
+ "grad_norm": 0.3703945577144623,
+ "learning_rate": 4.696280627277356e-06,
+ "loss": 0.0047,
+ "step": 165
+ },
+ {
+ "epoch": 1.9375,
+ "grad_norm": 0.2503529191017151,
+ "learning_rate": 4.687065016310996e-06,
+ "loss": 0.0052,
+ "step": 166
+ },
+ {
+ "epoch": 1.94921875,
+ "grad_norm": 0.3613075315952301,
+ "learning_rate": 4.6777209974868194e-06,
+ "loss": 0.0034,
+ "step": 167
+ },
+ {
+ "epoch": 1.9609375,
+ "grad_norm": 0.3578515350818634,
+ "learning_rate": 4.668249119414692e-06,
+ "loss": 0.0021,
+ "step": 168
+ },
+ {
+ "epoch": 1.97265625,
+ "grad_norm": 0.1784515529870987,
+ "learning_rate": 4.6586499382113985e-06,
+ "loss": 0.0018,
+ "step": 169
+ },
+ {
+ "epoch": 1.984375,
+ "grad_norm": 0.259198397397995,
+ "learning_rate": 4.648924017468003e-06,
+ "loss": 0.0009,
+ "step": 170
+ },
+ {
+ "epoch": 1.99609375,
+ "grad_norm": 0.7194133400917053,
+ "learning_rate": 4.6390719282167515e-06,
+ "loss": 0.0041,
+ "step": 171
+ },
+ {
+ "epoch": 2.0,
+ "grad_norm": 0.7194133400917053,
+ "learning_rate": 4.629094248897546e-06,
+ "loss": 0.0014,
+ "step": 172
+ },
+ {
+ "epoch": 2.01171875,
+ "grad_norm": 0.5032601952552795,
+ "learning_rate": 4.618991565323987e-06,
+ "loss": 0.0028,
+ "step": 173
+ },
+ {
+ "epoch": 2.0234375,
+ "grad_norm": 0.6387512683868408,
+ "learning_rate": 4.608764470648971e-06,
+ "loss": 0.0007,
+ "step": 174
+ },
+ {
+ "epoch": 2.03515625,
+ "grad_norm": 0.23177844285964966,
+ "learning_rate": 4.598413565329876e-06,
+ "loss": 0.0006,
+ "step": 175
+ },
+ {
+ "epoch": 2.046875,
+ "grad_norm": 0.1713147759437561,
+ "learning_rate": 4.587939457093296e-06,
+ "loss": 0.0003,
+ "step": 176
+ },
+ {
+ "epoch": 2.05859375,
+ "grad_norm": 0.06128697097301483,
+ "learning_rate": 4.577342760899368e-06,
+ "loss": 0.0001,
+ "step": 177
+ },
+ {
+ "epoch": 2.0703125,
+ "grad_norm": 0.538530170917511,
+ "learning_rate": 4.566624098905665e-06,
+ "loss": 0.0004,
+ "step": 178
+ },
+ {
+ "epoch": 2.08203125,
+ "grad_norm": 0.03301696106791496,
+ "learning_rate": 4.555784100430662e-06,
+ "loss": 0.0004,
+ "step": 179
+ },
+ {
+ "epoch": 2.09375,
+ "grad_norm": 0.21366432309150696,
+ "learning_rate": 4.544823401916794e-06,
+ "loss": 0.0014,
+ "step": 180
+ },
+ {
+ "epoch": 2.10546875,
+ "grad_norm": 0.13440090417861938,
+ "learning_rate": 4.533742646893086e-06,
+ "loss": 0.0004,
+ "step": 181
+ },
+ {
+ "epoch": 2.1171875,
+ "grad_norm": 0.531997799873352,
+ "learning_rate": 4.522542485937369e-06,
+ "loss": 0.0008,
+ "step": 182
+ },
+ {
+ "epoch": 2.12890625,
+ "grad_norm": 0.2832719385623932,
+ "learning_rate": 4.511223576638084e-06,
+ "loss": 0.0023,
+ "step": 183
+ },
+ {
+ "epoch": 2.140625,
+ "grad_norm": 0.3814002275466919,
+ "learning_rate": 4.499786583555675e-06,
+ "loss": 0.001,
+ "step": 184
+ },
+ {
+ "epoch": 2.15234375,
+ "grad_norm": 0.2522885501384735,
+ "learning_rate": 4.4882321781835666e-06,
+ "loss": 0.0004,
+ "step": 185
+ },
+ {
+ "epoch": 2.1640625,
+ "grad_norm": 0.3866797983646393,
+ "learning_rate": 4.476561038908745e-06,
+ "loss": 0.0007,
+ "step": 186
+ },
+ {
+ "epoch": 2.17578125,
+ "grad_norm": 0.2128417044878006,
+ "learning_rate": 4.464773850971924e-06,
+ "loss": 0.0001,
+ "step": 187
+ },
+ {
+ "epoch": 2.1875,
+ "grad_norm": 0.135880708694458,
+ "learning_rate": 4.452871306427314e-06,
+ "loss": 0.0031,
+ "step": 188
+ },
+ {
+ "epoch": 2.19921875,
+ "grad_norm": 0.38835451006889343,
+ "learning_rate": 4.440854104101988e-06,
+ "loss": 0.0015,
+ "step": 189
+ },
+ {
+ "epoch": 2.2109375,
+ "grad_norm": 0.18233123421669006,
+ "learning_rate": 4.428722949554858e-06,
+ "loss": 0.0001,
+ "step": 190
+ },
+ {
+ "epoch": 2.22265625,
+ "grad_norm": 0.10753051191568375,
+ "learning_rate": 4.416478555035241e-06,
+ "loss": 0.0017,
+ "step": 191
+ },
+ {
+ "epoch": 2.234375,
+ "grad_norm": 0.30138343572616577,
+ "learning_rate": 4.404121639441047e-06,
+ "loss": 0.0004,
+ "step": 192
+ },
+ {
+ "epoch": 2.24609375,
+ "grad_norm": 0.12771356105804443,
+ "learning_rate": 4.391652928276572e-06,
+ "loss": 0.0022,
+ "step": 193
+ },
+ {
+ "epoch": 2.2578125,
+ "grad_norm": 0.4173564612865448,
+ "learning_rate": 4.379073153609896e-06,
+ "loss": 0.0001,
+ "step": 194
+ },
+ {
+ "epoch": 2.26953125,
+ "grad_norm": 0.08329658955335617,
+ "learning_rate": 4.366383054029907e-06,
+ "loss": 0.0009,
+ "step": 195
+ },
+ {
+ "epoch": 2.28125,
+ "grad_norm": 0.21187439560890198,
+ "learning_rate": 4.3535833746029335e-06,
+ "loss": 0.0013,
+ "step": 196
+ },
+ {
+ "epoch": 2.29296875,
+ "grad_norm": 0.046030864119529724,
+ "learning_rate": 4.340674866829001e-06,
+ "loss": 0.0004,
+ "step": 197
+ },
+ {
+ "epoch": 2.3046875,
+ "grad_norm": 0.08373020589351654,
+ "learning_rate": 4.32765828859771e-06,
+ "loss": 0.0014,
+ "step": 198
+ },
+ {
+ "epoch": 2.31640625,
+ "grad_norm": 0.4026390314102173,
+ "learning_rate": 4.314534404143738e-06,
+ "loss": 0.0003,
+ "step": 199
+ },
+ {
+ "epoch": 2.328125,
+ "grad_norm": 0.24255593121051788,
+ "learning_rate": 4.3013039840019675e-06,
+ "loss": 0.0009,
+ "step": 200
+ },
+ {
+ "epoch": 2.33984375,
+ "grad_norm": 0.2282780110836029,
+ "learning_rate": 4.287967804962252e-06,
+ "loss": 0.0025,
+ "step": 201
+ },
+ {
+ "epoch": 2.3515625,
+ "grad_norm": 0.14743350446224213,
+ "learning_rate": 4.274526650023801e-06,
+ "loss": 0.0014,
+ "step": 202
+ },
+ {
+ "epoch": 2.36328125,
+ "grad_norm": 0.17971713840961456,
+ "learning_rate": 4.260981308349214e-06,
+ "loss": 0.0003,
+ "step": 203
+ },
+ {
+ "epoch": 2.375,
+ "grad_norm": 0.03872796148061752,
+ "learning_rate": 4.247332575218144e-06,
+ "loss": 0.0003,
+ "step": 204
+ },
+ {
+ "epoch": 2.38671875,
+ "grad_norm": 0.06636863946914673,
+ "learning_rate": 4.233581251980604e-06,
+ "loss": 0.0004,
+ "step": 205
+ },
+ {
+ "epoch": 2.3984375,
+ "grad_norm": 0.1254304051399231,
+ "learning_rate": 4.2197281460099245e-06,
+ "loss": 0.0002,
+ "step": 206
+ },
+ {
+ "epoch": 2.41015625,
+ "grad_norm": 0.03998701646924019,
+ "learning_rate": 4.2057740706553415e-06,
+ "loss": 0.0007,
+ "step": 207
+ },
+ {
+ "epoch": 2.421875,
+ "grad_norm": 0.8734745979309082,
+ "learning_rate": 4.191719845194246e-06,
+ "loss": 0.0019,
+ "step": 208
+ },
+ {
+ "epoch": 2.43359375,
+ "grad_norm": 0.34975236654281616,
+ "learning_rate": 4.177566294784085e-06,
+ "loss": 0.0006,
+ "step": 209
+ },
+ {
+ "epoch": 2.4453125,
+ "grad_norm": 0.07566183060407639,
+ "learning_rate": 4.163314250413913e-06,
+ "loss": 0.0003,
+ "step": 210
+ },
+ {
+ "epoch": 2.45703125,
+ "grad_norm": 0.09056711941957474,
+ "learning_rate": 4.148964548855603e-06,
+ "loss": 0.0002,
+ "step": 211
+ },
+ {
+ "epoch": 2.46875,
+ "grad_norm": 0.16160684823989868,
+ "learning_rate": 4.134518032614713e-06,
+ "loss": 0.0009,
+ "step": 212
+ },
+ {
+ "epoch": 2.48046875,
+ "grad_norm": 0.0812753438949585,
+ "learning_rate": 4.119975549881029e-06,
+ "loss": 0.0002,
+ "step": 213
+ },
+ {
+ "epoch": 2.4921875,
+ "grad_norm": 0.05827738344669342,
+ "learning_rate": 4.105337954478756e-06,
+ "loss": 0.0007,
+ "step": 214
+ },
+ {
+ "epoch": 2.50390625,
+ "grad_norm": 0.2625848054885864,
+ "learning_rate": 4.0906061058164e-06,
+ "loss": 0.0003,
+ "step": 215
+ },
+ {
+ "epoch": 2.515625,
+ "grad_norm": 0.1771923154592514,
+ "learning_rate": 4.075780868836296e-06,
+ "loss": 0.0005,
+ "step": 216
+ },
+ {
+ "epoch": 2.52734375,
+ "grad_norm": 0.034166041761636734,
+ "learning_rate": 4.060863113963835e-06,
+ "loss": 0.0012,
+ "step": 217
+ },
+ {
+ "epoch": 2.5390625,
+ "grad_norm": 0.14099521934986115,
+ "learning_rate": 4.045853717056358e-06,
+ "loss": 0.0,
+ "step": 218
+ },
+ {
+ "epoch": 2.55078125,
+ "grad_norm": 0.34704917669296265,
+ "learning_rate": 4.030753559351728e-06,
+ "loss": 0.0006,
+ "step": 219
+ },
+ {
+ "epoch": 2.5625,
+ "grad_norm": 0.25681111216545105,
+ "learning_rate": 4.015563527416596e-06,
+ "loss": 0.0004,
+ "step": 220
+ },
+ {
+ "epoch": 2.57421875,
+ "grad_norm": 0.36212408542633057,
+ "learning_rate": 4.000284513094342e-06,
+ "loss": 0.0003,
+ "step": 221
+ },
+ {
+ "epoch": 2.5859375,
+ "grad_norm": 0.13945375382900238,
+ "learning_rate": 3.984917413452721e-06,
+ "loss": 0.0001,
+ "step": 222
+ },
+ {
+ "epoch": 2.59765625,
+ "grad_norm": 0.06798060238361359,
+ "learning_rate": 3.969463130731183e-06,
+ "loss": 0.0007,
+ "step": 223
+ },
+ {
+ "epoch": 2.609375,
+ "grad_norm": 0.19848179817199707,
+ "learning_rate": 3.953922572287915e-06,
+ "loss": 0.0007,
+ "step": 224
+ },
+ {
+ "epoch": 2.62109375,
+ "grad_norm": 0.5454645156860352,
+ "learning_rate": 3.938296650546552e-06,
+ "loss": 0.0018,
+ "step": 225
+ },
+ {
+ "epoch": 2.6328125,
+ "grad_norm": 0.22043731808662415,
+ "learning_rate": 3.9225862829426184e-06,
+ "loss": 0.0036,
+ "step": 226
+ },
+ {
+ "epoch": 2.64453125,
+ "grad_norm": 0.3086087107658386,
+ "learning_rate": 3.906792391869657e-06,
+ "loss": 0.0002,
+ "step": 227
+ },
+ {
+ "epoch": 2.65625,
+ "grad_norm": 0.04387599974870682,
+ "learning_rate": 3.890915904625075e-06,
+ "loss": 0.0014,
+ "step": 228
+ },
+ {
+ "epoch": 2.66796875,
+ "grad_norm": 0.3786030113697052,
+ "learning_rate": 3.874957753355701e-06,
+ "loss": 0.0014,
+ "step": 229
+ },
+ {
+ "epoch": 2.6796875,
+ "grad_norm": 0.28310713171958923,
+ "learning_rate": 3.858918875003053e-06,
+ "loss": 0.0001,
+ "step": 230
+ },
+ {
+ "epoch": 2.69140625,
+ "grad_norm": 0.0586460717022419,
+ "learning_rate": 3.842800211248333e-06,
+ "loss": 0.0001,
+ "step": 231
+ },
+ {
+ "epoch": 2.703125,
+ "grad_norm": 0.11408677697181702,
+ "learning_rate": 3.8266027084571335e-06,
+ "loss": 0.001,
+ "step": 232
+ },
+ {
+ "epoch": 2.71484375,
+ "grad_norm": 0.06875021010637283,
+ "learning_rate": 3.810327317623881e-06,
+ "loss": 0.0001,
+ "step": 233
+ },
+ {
+ "epoch": 2.7265625,
+ "grad_norm": 0.037388525903224945,
+ "learning_rate": 3.793974994315991e-06,
+ "loss": 0.0002,
+ "step": 234
+ },
+ {
+ "epoch": 2.73828125,
+ "grad_norm": 0.041430581361055374,
+ "learning_rate": 3.7775466986177763e-06,
+ "loss": 0.0015,
+ "step": 235
+ },
+ {
+ "epoch": 2.75,
+ "grad_norm": 0.26019373536109924,
+ "learning_rate": 3.7610433950740667e-06,
+ "loss": 0.0022,
+ "step": 236
+ },
+ {
+ "epoch": 2.76171875,
+ "grad_norm": 0.16638831794261932,
+ "learning_rate": 3.7444660526335853e-06,
+ "loss": 0.0001,
+ "step": 237
+ },
+ {
+ "epoch": 2.7734375,
+ "grad_norm": 0.11822371184825897,
+ "learning_rate": 3.7278156445920584e-06,
+ "loss": 0.0004,
+ "step": 238
+ },
+ {
+ "epoch": 2.78515625,
+ "grad_norm": 0.055076126009225845,
+ "learning_rate": 3.711093148535068e-06,
+ "loss": 0.0001,
+ "step": 239
+ },
+ {
+ "epoch": 2.796875,
+ "grad_norm": 0.08209875971078873,
+ "learning_rate": 3.6942995462806574e-06,
+ "loss": 0.0012,
+ "step": 240
+ },
+ {
+ "epoch": 2.80859375,
+ "grad_norm": 0.10523220896720886,
+ "learning_rate": 3.6774358238216878e-06,
+ "loss": 0.0004,
+ "step": 241
+ },
+ {
+ "epoch": 2.8203125,
+ "grad_norm": 0.09211058169603348,
+ "learning_rate": 3.660502971267945e-06,
+ "loss": 0.0007,
+ "step": 242
+ },
+ {
+ "epoch": 2.83203125,
+ "grad_norm": 0.6209844946861267,
+ "learning_rate": 3.6435019827880093e-06,
+ "loss": 0.0004,
+ "step": 243
+ },
+ {
+ "epoch": 2.84375,
+ "grad_norm": 0.030900023877620697,
+ "learning_rate": 3.626433856550886e-06,
+ "loss": 0.0002,
+ "step": 244
+ },
+ {
+ "epoch": 2.85546875,
+ "grad_norm": 0.041130077093839645,
+ "learning_rate": 3.6092995946673996e-06,
+ "loss": 0.0003,
+ "step": 245
+ },
+ {
+ "epoch": 2.8671875,
+ "grad_norm": 0.052536819130182266,
+ "learning_rate": 3.5921002031313586e-06,
+ "loss": 0.0001,
+ "step": 246
+ },
+ {
+ "epoch": 2.87890625,
+ "grad_norm": 0.027478178963065147,
+ "learning_rate": 3.574836691760489e-06,
+ "loss": 0.0011,
+ "step": 247
+ },
+ {
+ "epoch": 2.890625,
+ "grad_norm": 0.11695867031812668,
+ "learning_rate": 3.557510074137147e-06,
+ "loss": 0.0002,
+ "step": 248
+ },
+ {
+ "epoch": 2.90234375,
+ "grad_norm": 0.08782754838466644,
+ "learning_rate": 3.540121367548811e-06,
+ "loss": 0.001,
+ "step": 249
+ },
+ {
+ "epoch": 2.9140625,
+ "grad_norm": 0.19123269617557526,
+ "learning_rate": 3.5226715929283507e-06,
+ "loss": 0.0001,
+ "step": 250
+ },
+ {
+ "epoch": 2.92578125,
+ "grad_norm": 0.020774945616722107,
+ "learning_rate": 3.505161774794085e-06,
+ "loss": 0.0006,
+ "step": 251
+ },
+ {
+ "epoch": 2.9375,
+ "grad_norm": 0.12062892317771912,
+ "learning_rate": 3.487592941189636e-06,
+ "loss": 0.0001,
+ "step": 252
+ },
+ {
+ "epoch": 2.94921875,
+ "grad_norm": 0.013076180592179298,
+ "learning_rate": 3.469966123623563e-06,
+ "loss": 0.0011,
+ "step": 253
+ },
+ {
+ "epoch": 2.9609375,
+ "grad_norm": 0.22065430879592896,
+ "learning_rate": 3.4522823570088073e-06,
+ "loss": 0.0001,
+ "step": 254
+ },
+ {
+ "epoch": 2.97265625,
+ "grad_norm": 0.027459079399704933,
+ "learning_rate": 3.434542679601922e-06,
+ "loss": 0.0003,
+ "step": 255
+ },
+ {
+ "epoch": 2.984375,
+ "grad_norm": 0.07469172775745392,
+ "learning_rate": 3.4167481329421204e-06,
+ "loss": 0.0005,
+ "step": 256
+ },
+ {
+ "epoch": 2.99609375,
+ "grad_norm": 0.544292688369751,
+ "learning_rate": 3.39889976179012e-06,
+ "loss": 0.0001,
+ "step": 257
+ },
+ {
+ "epoch": 3.0,
+ "grad_norm": 0.02610701508820057,
+ "learning_rate": 3.380998614066805e-06,
+ "loss": 0.0,
+ "step": 258
+ },
+ {
+ "epoch": 3.01171875,
+ "grad_norm": 0.016433028504252434,
+ "learning_rate": 3.363045740791698e-06,
+ "loss": 0.0,
+ "step": 259
+ },
+ {
+ "epoch": 3.0234375,
+ "grad_norm": 0.009407744742929935,
+ "learning_rate": 3.345042196021257e-06,
+ "loss": 0.0,
+ "step": 260
+ },
+ {
+ "epoch": 3.03515625,
+ "grad_norm": 0.009587760083377361,
+ "learning_rate": 3.326989036786981e-06,
+ "loss": 0.0,
+ "step": 261
+ },
+ {
+ "epoch": 3.046875,
+ "grad_norm": 0.021458568051457405,
+ "learning_rate": 3.3088873230333562e-06,
+ "loss": 0.0001,
+ "step": 262
+ },
+ {
+ "epoch": 3.05859375,
+ "grad_norm": 1.3090940713882446,
+ "learning_rate": 3.290738117555622e-06,
+ "loss": 0.0007,
+ "step": 263
+ },
+ {
+ "epoch": 3.0703125,
+ "grad_norm": 0.008000005036592484,
+ "learning_rate": 3.272542485937369e-06,
+ "loss": 0.0,
+ "step": 264
+ },
+ {
+ "epoch": 3.08203125,
+ "grad_norm": 0.11048968136310577,
+ "learning_rate": 3.2543014964879814e-06,
+ "loss": 0.0004,
+ "step": 265
+ },
+ {
+ "epoch": 3.09375,
+ "grad_norm": 0.010688518173992634,
+ "learning_rate": 3.2360162201799085e-06,
+ "loss": 0.0,
+ "step": 266
+ },
+ {
+ "epoch": 3.10546875,
+ "grad_norm": 0.0585443377494812,
+ "learning_rate": 3.21768773058579e-06,
+ "loss": 0.0001,
+ "step": 267
+ },
+ {
+ "epoch": 3.1171875,
+ "grad_norm": 0.12098421901464462,
+ "learning_rate": 3.1993171038154203e-06,
+ "loss": 0.0002,
+ "step": 268
+ },
+ {
+ "epoch": 3.12890625,
+ "grad_norm": 0.01194986142218113,
+ "learning_rate": 3.180905418452569e-06,
+ "loss": 0.0,
+ "step": 269
+ },
+ {
+ "epoch": 3.140625,
+ "grad_norm": 0.0898946076631546,
+ "learning_rate": 3.162453755491655e-06,
+ "loss": 0.0011,
+ "step": 270
+ },
+ {
+ "epoch": 3.15234375,
+ "grad_norm": 0.04248907417058945,
+ "learning_rate": 3.143963198274278e-06,
+ "loss": 0.0001,
+ "step": 271
+ },
+ {
+ "epoch": 3.1640625,
+ "grad_norm": 0.11775418370962143,
+ "learning_rate": 3.125434832425613e-06,
+ "loss": 0.0002,
+ "step": 272
+ },
+ {
+ "epoch": 3.17578125,
+ "grad_norm": 0.009955376386642456,
+ "learning_rate": 3.1068697457906736e-06,
+ "loss": 0.0,
+ "step": 273
+ },
+ {
+ "epoch": 3.1875,
+ "grad_norm": 0.010195266455411911,
+ "learning_rate": 3.0882690283704355e-06,
+ "loss": 0.0,
+ "step": 274
+ },
+ {
+ "epoch": 3.19921875,
+ "grad_norm": 0.0036824019625782967,
+ "learning_rate": 3.0696337722578444e-06,
+ "loss": 0.0,
+ "step": 275
+ },
+ {
+ "epoch": 3.2109375,
+ "grad_norm": 0.004132798407226801,
+ "learning_rate": 3.0509650715736977e-06,
+ "loss": 0.0,
+ "step": 276
+ },
+ {
+ "epoch": 3.22265625,
+ "grad_norm": 0.0651523619890213,
+ "learning_rate": 3.0322640224024024e-06,
+ "loss": 0.0001,
+ "step": 277
+ },
+ {
+ "epoch": 3.234375,
+ "grad_norm": 0.015174048021435738,
+ "learning_rate": 3.0135317227276247e-06,
+ "loss": 0.0,
+ "step": 278
+ },
+ {
+ "epoch": 3.24609375,
+ "grad_norm": 0.004420771263539791,
+ "learning_rate": 2.994769272367822e-06,
+ "loss": 0.0,
+ "step": 279
+ },
+ {
+ "epoch": 3.2578125,
+ "grad_norm": 0.019537663087248802,
+ "learning_rate": 2.975977772911671e-06,
+ "loss": 0.0001,
+ "step": 280
+ },
+ {
+ "epoch": 3.26953125,
+ "grad_norm": 0.005312444642186165,
+ "learning_rate": 2.9571583276533923e-06,
+ "loss": 0.0,
+ "step": 281
+ },
+ {
+ "epoch": 3.28125,
+ "grad_norm": 0.005001228302717209,
+ "learning_rate": 2.93831204152797e-06,
+ "loss": 0.0,
+ "step": 282
+ },
+ {
+ "epoch": 3.29296875,
+ "grad_norm": 0.02515912428498268,
+ "learning_rate": 2.9194400210462808e-06,
+ "loss": 0.0,
+ "step": 283
+ },
+ {
+ "epoch": 3.3046875,
+ "grad_norm": 0.0026461018715053797,
+ "learning_rate": 2.9005433742301274e-06,
+ "loss": 0.0,
+ "step": 284
+ },
+ {
+ "epoch": 3.31640625,
+ "grad_norm": 0.008561859838664532,
+ "learning_rate": 2.8816232105471864e-06,
+ "loss": 0.0,
+ "step": 285
+ },
+ {
+ "epoch": 3.328125,
+ "grad_norm": 0.0016494860174134374,
+ "learning_rate": 2.8626806408458626e-06,
+ "loss": 0.0,
+ "step": 286
+ },
+ {
+ "epoch": 3.33984375,
+ "grad_norm": 0.13021136820316315,
+ "learning_rate": 2.843716777290074e-06,
+ "loss": 0.0007,
+ "step": 287
+ },
+ {
+ "epoch": 3.3515625,
+ "grad_norm": 0.0030203904025256634,
+ "learning_rate": 2.8247327332939512e-06,
+ "loss": 0.0,
+ "step": 288
+ },
+ {
+ "epoch": 3.36328125,
+ "grad_norm": 0.03953886777162552,
+ "learning_rate": 2.805729623456469e-06,
+ "loss": 0.0,
+ "step": 289
+ },
+ {
+ "epoch": 3.375,
+ "grad_norm": 0.016400372609496117,
+ "learning_rate": 2.786708563496002e-06,
+ "loss": 0.0,
+ "step": 290
+ },
+ {
+ "epoch": 3.38671875,
+ "grad_norm": 0.0036580052692443132,
+ "learning_rate": 2.7676706701848187e-06,
+ "loss": 0.0,
+ "step": 291
+ },
+ {
+ "epoch": 3.3984375,
+ "grad_norm": 0.013516291044652462,
+ "learning_rate": 2.748617061283518e-06,
+ "loss": 0.0,
+ "step": 292
+ },
+ {
+ "epoch": 3.41015625,
+ "grad_norm": 0.0161955077201128,
+ "learning_rate": 2.7295488554753957e-06,
+ "loss": 0.0,
+ "step": 293
+ },
+ {
+ "epoch": 3.421875,
+ "grad_norm": 0.030412085354328156,
+ "learning_rate": 2.710467172300768e-06,
+ "loss": 0.0,
+ "step": 294
+ },
+ {
+ "epoch": 3.43359375,
+ "grad_norm": 0.009741670452058315,
+ "learning_rate": 2.69137313209124e-06,
+ "loss": 0.0,
+ "step": 295
+ },
+ {
+ "epoch": 3.4453125,
+ "grad_norm": 0.0022640388924628496,
+ "learning_rate": 2.672267855903927e-06,
+ "loss": 0.0,
+ "step": 296
+ },
+ {
+ "epoch": 3.45703125,
+ "grad_norm": 0.004546131007373333,
+ "learning_rate": 2.653152465455639e-06,
+ "loss": 0.0,
+ "step": 297
+ },
+ {
+ "epoch": 3.46875,
+ "grad_norm": 0.00977818388491869,
+ "learning_rate": 2.6340280830570142e-06,
+ "loss": 0.0,
+ "step": 298
+ },
+ {
+ "epoch": 3.48046875,
+ "grad_norm": 0.00292399013414979,
+ "learning_rate": 2.614895831546633e-06,
+ "loss": 0.0,
+ "step": 299
+ },
+ {
+ "epoch": 3.4921875,
+ "grad_norm": 0.02362428605556488,
+ "learning_rate": 2.595756834225089e-06,
+ "loss": 0.0001,
+ "step": 300
+ },
+ {
+ "epoch": 3.50390625,
+ "grad_norm": 0.05170333385467529,
+ "learning_rate": 2.576612214789039e-06,
+ "loss": 0.0001,
+ "step": 301
+ },
+ {
+ "epoch": 3.515625,
+ "grad_norm": 0.002428271807730198,
+ "learning_rate": 2.5574630972652263e-06,
+ "loss": 0.0,
+ "step": 302
+ },
+ {
+ "epoch": 3.52734375,
+ "grad_norm": 0.0020236221607774496,
+ "learning_rate": 2.538310605944491e-06,
+ "loss": 0.0,
+ "step": 303
+ },
+ {
+ "epoch": 3.5390625,
+ "grad_norm": 0.0026413940358906984,
+ "learning_rate": 2.5191558653157542e-06,
+ "loss": 0.0,
+ "step": 304
+ },
+ {
+ "epoch": 3.55078125,
+ "grad_norm": 0.001937767956405878,
+ "learning_rate": 2.5e-06,
+ "loss": 0.0,
+ "step": 305
+ },
+ {
+ "epoch": 3.5625,
+ "grad_norm": 0.013072842732071877,
+ "learning_rate": 2.480844134684246e-06,
+ "loss": 0.0,
+ "step": 306
+ },
+ {
+ "epoch": 3.57421875,
+ "grad_norm": 0.07046481966972351,
+ "learning_rate": 2.4616893940555094e-06,
+ "loss": 0.0003,
+ "step": 307
+ },
+ {
+ "epoch": 3.5859375,
+ "grad_norm": 0.002507950412109494,
+ "learning_rate": 2.4425369027347746e-06,
+ "loss": 0.0,
+ "step": 308
+ },
+ {
+ "epoch": 3.59765625,
+ "grad_norm": 0.0024932159576565027,
+ "learning_rate": 2.423387785210962e-06,
+ "loss": 0.0,
+ "step": 309
+ },
+ {
+ "epoch": 3.609375,
+ "grad_norm": 0.007839293219149113,
+ "learning_rate": 2.404243165774912e-06,
+ "loss": 0.0,
+ "step": 310
+ },
+ {
+ "epoch": 3.62109375,
+ "grad_norm": 0.008749544620513916,
+ "learning_rate": 2.3851041684533677e-06,
+ "loss": 0.0,
+ "step": 311
+ },
+ {
+ "epoch": 3.6328125,
+ "grad_norm": 0.00224123802036047,
+ "learning_rate": 2.3659719169429866e-06,
+ "loss": 0.0,
+ "step": 312
+ },
+ {
+ "epoch": 3.64453125,
+ "grad_norm": 0.0036495248787105083,
+ "learning_rate": 2.346847534544362e-06,
+ "loss": 0.0,
+ "step": 313
+ },
+ {
+ "epoch": 3.65625,
+ "grad_norm": 0.008617470040917397,
+ "learning_rate": 2.3277321440960733e-06,
+ "loss": 0.0,
+ "step": 314
+ },
+ {
+ "epoch": 3.66796875,
+ "grad_norm": 0.20711803436279297,
+ "learning_rate": 2.308626867908761e-06,
+ "loss": 0.0004,
+ "step": 315
+ },
+ {
+ "epoch": 3.6796875,
+ "grad_norm": 0.002029536757618189,
+ "learning_rate": 2.2895328276992325e-06,
+ "loss": 0.0,
+ "step": 316
+ },
+ {
+ "epoch": 3.69140625,
+ "grad_norm": 0.0029692472890019417,
+ "learning_rate": 2.270451144524605e-06,
+ "loss": 0.0,
+ "step": 317
+ },
+ {
+ "epoch": 3.703125,
+ "grad_norm": 0.003482841420918703,
+ "learning_rate": 2.251382938716482e-06,
+ "loss": 0.0,
+ "step": 318
+ },
+ {
+ "epoch": 3.71484375,
+ "grad_norm": 0.004736272618174553,
+ "learning_rate": 2.2323293298151817e-06,
+ "loss": 0.0,
+ "step": 319
+ },
+ {
+ "epoch": 3.7265625,
+ "grad_norm": 0.002524860203266144,
+ "learning_rate": 2.2132914365039993e-06,
+ "loss": 0.0,
+ "step": 320
+ },
+ {
+ "epoch": 3.73828125,
+ "grad_norm": 0.0024032641667872667,
+ "learning_rate": 2.1942703765435317e-06,
+ "loss": 0.0,
+ "step": 321
+ },
+ {
+ "epoch": 3.75,
+ "grad_norm": 0.06402894109487534,
+ "learning_rate": 2.1752672667060488e-06,
+ "loss": 0.0002,
+ "step": 322
+ },
+ {
+ "epoch": 3.76171875,
+ "grad_norm": 0.0013841127511113882,
+ "learning_rate": 2.1562832227099266e-06,
+ "loss": 0.0,
+ "step": 323
+ },
+ {
+ "epoch": 3.7734375,
+ "grad_norm": 0.002198501257225871,
+ "learning_rate": 2.137319359154138e-06,
+ "loss": 0.0,
+ "step": 324
+ },
+ {
+ "epoch": 3.78515625,
+ "grad_norm": 0.004288461524993181,
+ "learning_rate": 2.1183767894528135e-06,
+ "loss": 0.0,
+ "step": 325
+ },
+ {
+ "epoch": 3.796875,
+ "grad_norm": 0.16602352261543274,
+ "learning_rate": 2.099456625769872e-06,
+ "loss": 0.0003,
+ "step": 326
+ },
+ {
+ "epoch": 3.80859375,
+ "grad_norm": 0.001620235969312489,
+ "learning_rate": 2.08055997895372e-06,
+ "loss": 0.0,
+ "step": 327
+ },
+ {
+ "epoch": 3.8203125,
+ "grad_norm": 0.004387021530419588,
+ "learning_rate": 2.0616879584720305e-06,
+ "loss": 0.0,
+ "step": 328
+ },
+ {
+ "epoch": 3.83203125,
+ "grad_norm": 0.040472231805324554,
+ "learning_rate": 2.042841672346608e-06,
+ "loss": 0.0001,
+ "step": 329
+ },
+ {
+ "epoch": 3.84375,
+ "grad_norm": 0.03627858683466911,
+ "learning_rate": 2.024022227088329e-06,
+ "loss": 0.0001,
+ "step": 330
+ },
+ {
+ "epoch": 3.85546875,
+ "grad_norm": 0.0029672810342162848,
+ "learning_rate": 2.0052307276321793e-06,
+ "loss": 0.0,
+ "step": 331
+ },
+ {
+ "epoch": 3.8671875,
+ "grad_norm": 0.0023526407312601805,
+ "learning_rate": 1.9864682772723757e-06,
+ "loss": 0.0,
+ "step": 332
+ },
+ {
+ "epoch": 3.87890625,
+ "grad_norm": 0.001383278169669211,
+ "learning_rate": 1.967735977597598e-06,
+ "loss": 0.0,
+ "step": 333
+ },
+ {
+ "epoch": 3.890625,
+ "grad_norm": 0.002337483922019601,
+ "learning_rate": 1.9490349284263036e-06,
+ "loss": 0.0,
+ "step": 334
+ },
+ {
+ "epoch": 3.90234375,
+ "grad_norm": 0.02629532851278782,
+ "learning_rate": 1.930366227742157e-06,
+ "loss": 0.0,
+ "step": 335
+ },
+ {
+ "epoch": 3.9140625,
+ "grad_norm": 0.03508671000599861,
+ "learning_rate": 1.9117309716295658e-06,
+ "loss": 0.0001,
+ "step": 336
+ },
+ {
+ "epoch": 3.92578125,
+ "grad_norm": 0.0021862757857888937,
+ "learning_rate": 1.8931302542093274e-06,
+ "loss": 0.0,
+ "step": 337
+ },
+ {
+ "epoch": 3.9375,
+ "grad_norm": 0.002468815306201577,
+ "learning_rate": 1.8745651675743876e-06,
+ "loss": 0.0,
+ "step": 338
+ },
+ {
+ "epoch": 3.94921875,
+ "grad_norm": 0.028530335053801537,
+ "learning_rate": 1.8560368017257229e-06,
+ "loss": 0.0001,
+ "step": 339
+ },
+ {
+ "epoch": 3.9609375,
+ "grad_norm": 0.004602192435413599,
+ "learning_rate": 1.8375462445083464e-06,
+ "loss": 0.0,
+ "step": 340
+ },
+ {
+ "epoch": 3.97265625,
+ "grad_norm": 0.004955258686095476,
+ "learning_rate": 1.8190945815474323e-06,
+ "loss": 0.0,
+ "step": 341
+ },
+ {
+ "epoch": 3.984375,
+ "grad_norm": 0.0018305755220353603,
+ "learning_rate": 1.8006828961845807e-06,
+ "loss": 0.0,
+ "step": 342
+ },
+ {
+ "epoch": 3.99609375,
+ "grad_norm": 0.004913098178803921,
+ "learning_rate": 1.782312269414211e-06,
+ "loss": 0.0,
+ "step": 343
+ },
+ {
+ "epoch": 4.0,
+ "grad_norm": 0.004913098178803921,
+ "learning_rate": 1.7639837798200923e-06,
+ "loss": 0.0,
+ "step": 344
+ },
+ {
+ "epoch": 4.01171875,
+ "grad_norm": 0.004227044992148876,
+ "learning_rate": 1.7456985035120194e-06,
+ "loss": 0.0,
+ "step": 345
+ },
+ {
+ "epoch": 4.0234375,
+ "grad_norm": 0.0020636608824133873,
+ "learning_rate": 1.7274575140626318e-06,
+ "loss": 0.0,
+ "step": 346
+ },
+ {
+ "epoch": 4.03515625,
+ "grad_norm": 0.010954855009913445,
+ "learning_rate": 1.709261882444379e-06,
+ "loss": 0.0,
+ "step": 347
+ },
+ {
+ "epoch": 4.046875,
+ "grad_norm": 0.021605566143989563,
+ "learning_rate": 1.6911126769666442e-06,
+ "loss": 0.0,
+ "step": 348
+ },
+ {
+ "epoch": 4.05859375,
+ "grad_norm": 0.003982124850153923,
+ "learning_rate": 1.6730109632130199e-06,
+ "loss": 0.0,
+ "step": 349
+ },
+ {
+ "epoch": 4.0703125,
+ "grad_norm": 0.019241735339164734,
+ "learning_rate": 1.6549578039787436e-06,
+ "loss": 0.0001,
+ "step": 350
+ },
+ {
+ "epoch": 4.08203125,
+ "grad_norm": 0.001743687316775322,
+ "learning_rate": 1.636954259208302e-06,
+ "loss": 0.0,
+ "step": 351
+ },
+ {
+ "epoch": 4.09375,
+ "grad_norm": 0.0027647230308502913,
+ "learning_rate": 1.6190013859331958e-06,
+ "loss": 0.0,
+ "step": 352
+ },
+ {
+ "epoch": 4.10546875,
+ "grad_norm": 0.001913967658765614,
+ "learning_rate": 1.6011002382098806e-06,
+ "loss": 0.0,
+ "step": 353
+ },
+ {
+ "epoch": 4.1171875,
+ "grad_norm": 0.0065271588973701,
+ "learning_rate": 1.5832518670578802e-06,
+ "loss": 0.0,
+ "step": 354
+ },
+ {
+ "epoch": 4.12890625,
+ "grad_norm": 0.0030666873790323734,
+ "learning_rate": 1.5654573203980782e-06,
+ "loss": 0.0,
+ "step": 355
+ },
+ {
+ "epoch": 4.140625,
+ "grad_norm": 0.006997556425631046,
+ "learning_rate": 1.5477176429911934e-06,
+ "loss": 0.0,
+ "step": 356
+ },
+ {
+ "epoch": 4.15234375,
+ "grad_norm": 0.0015223983209580183,
+ "learning_rate": 1.5300338763764371e-06,
+ "loss": 0.0,
+ "step": 357
+ },
+ {
+ "epoch": 4.1640625,
+ "grad_norm": 0.0016171627212315798,
+ "learning_rate": 1.5124070588103648e-06,
+ "loss": 0.0,
+ "step": 358
+ },
+ {
+ "epoch": 4.17578125,
+ "grad_norm": 0.001240705605596304,
+ "learning_rate": 1.4948382252059158e-06,
+ "loss": 0.0,
+ "step": 359
+ },
+ {
+ "epoch": 4.1875,
+ "grad_norm": 0.001194652053527534,
+ "learning_rate": 1.4773284070716504e-06,
+ "loss": 0.0,
+ "step": 360
+ },
+ {
+ "epoch": 4.19921875,
+ "grad_norm": 0.0016382395988330245,
+ "learning_rate": 1.4598786324511892e-06,
+ "loss": 0.0,
+ "step": 361
+ },
+ {
+ "epoch": 4.2109375,
+ "grad_norm": 0.004216539673507214,
+ "learning_rate": 1.4424899258628533e-06,
+ "loss": 0.0,
+ "step": 362
+ },
+ {
+ "epoch": 4.22265625,
+ "grad_norm": 0.0015016852412372828,
+ "learning_rate": 1.4251633082395117e-06,
+ "loss": 0.0,
+ "step": 363
+ },
+ {
+ "epoch": 4.234375,
+ "grad_norm": 0.002159053459763527,
+ "learning_rate": 1.4078997968686425e-06,
+ "loss": 0.0,
+ "step": 364
+ },
+ {
+ "epoch": 4.24609375,
+ "grad_norm": 0.0026948200538754463,
+ "learning_rate": 1.3907004053326006e-06,
+ "loss": 0.0,
+ "step": 365
+ },
+ {
+ "epoch": 4.2578125,
+ "grad_norm": 0.0025678593665361404,
+ "learning_rate": 1.373566143449115e-06,
+ "loss": 0.0,
+ "step": 366
+ },
+ {
+ "epoch": 4.26953125,
+ "grad_norm": 0.0020545010920614004,
+ "learning_rate": 1.3564980172119913e-06,
+ "loss": 0.0,
+ "step": 367
+ },
+ {
+ "epoch": 4.28125,
+ "grad_norm": 0.004045852459967136,
+ "learning_rate": 1.3394970287320553e-06,
+ "loss": 0.0,
+ "step": 368
+ },
+ {
+ "epoch": 4.29296875,
+ "grad_norm": 0.005362195894122124,
+ "learning_rate": 1.3225641761783126e-06,
+ "loss": 0.0,
+ "step": 369
+ },
+ {
+ "epoch": 4.3046875,
+ "grad_norm": 0.17514361441135406,
+ "learning_rate": 1.3057004537193424e-06,
+ "loss": 0.0002,
+ "step": 370
+ },
+ {
+ "epoch": 4.31640625,
+ "grad_norm": 0.002735719783231616,
+ "learning_rate": 1.2889068514649328e-06,
+ "loss": 0.0,
+ "step": 371
+ },
+ {
+ "epoch": 4.328125,
+ "grad_norm": 0.00350527698174119,
+ "learning_rate": 1.2721843554079418e-06,
+ "loss": 0.0,
+ "step": 372
+ },
+ {
+ "epoch": 4.33984375,
+ "grad_norm": 0.0011345328530296683,
+ "learning_rate": 1.2555339473664151e-06,
+ "loss": 0.0,
+ "step": 373
+ },
+ {
+ "epoch": 4.3515625,
+ "grad_norm": 0.01445677224546671,
+ "learning_rate": 1.238956604925934e-06,
+ "loss": 0.0,
+ "step": 374
+ },
+ {
+ "epoch": 4.36328125,
+ "grad_norm": 0.026896534487605095,
+ "learning_rate": 1.2224533013822237e-06,
+ "loss": 0.0,
+ "step": 375
+ },
+ {
+ "epoch": 4.375,
+ "grad_norm": 0.0032852741423994303,
+ "learning_rate": 1.206025005684009e-06,
+ "loss": 0.0,
+ "step": 376
+ },
+ {
+ "epoch": 4.38671875,
+ "grad_norm": 0.0014451753813773394,
+ "learning_rate": 1.1896726823761195e-06,
+ "loss": 0.0,
+ "step": 377
+ },
+ {
+ "epoch": 4.3984375,
+ "grad_norm": 0.002901519648730755,
+ "learning_rate": 1.1733972915428665e-06,
+ "loss": 0.0,
+ "step": 378
+ },
+ {
+ "epoch": 4.41015625,
+ "grad_norm": 0.001758516882546246,
+ "learning_rate": 1.1571997887516672e-06,
+ "loss": 0.0,
+ "step": 379
+ },
+ {
+ "epoch": 4.421875,
+ "grad_norm": 0.001257935306057334,
+ "learning_rate": 1.1410811249969475e-06,
+ "loss": 0.0,
+ "step": 380
+ },
+ {
+ "epoch": 4.43359375,
+ "grad_norm": 0.0016046202508732677,
+ "learning_rate": 1.1250422466442992e-06,
+ "loss": 0.0,
+ "step": 381
+ },
+ {
+ "epoch": 4.4453125,
+ "grad_norm": 0.0011374271707609296,
+ "learning_rate": 1.1090840953749253e-06,
+ "loss": 0.0,
+ "step": 382
+ },
+ {
+ "epoch": 4.45703125,
+ "grad_norm": 0.0027848149184137583,
+ "learning_rate": 1.0932076081303442e-06,
+ "loss": 0.0,
+ "step": 383
+ },
+ {
+ "epoch": 4.46875,
+ "grad_norm": 0.00223803473636508,
+ "learning_rate": 1.0774137170573826e-06,
+ "loss": 0.0,
+ "step": 384
+ },
+ {
+ "epoch": 4.48046875,
+ "grad_norm": 0.0018013437511399388,
+ "learning_rate": 1.0617033494534486e-06,
+ "loss": 0.0,
+ "step": 385
+ },
+ {
+ "epoch": 4.4921875,
+ "grad_norm": 0.0027079912833869457,
+ "learning_rate": 1.0460774277120866e-06,
+ "loss": 0.0,
+ "step": 386
+ },
+ {
+ "epoch": 4.50390625,
+ "grad_norm": 0.002311108633875847,
+ "learning_rate": 1.0305368692688175e-06,
+ "loss": 0.0,
+ "step": 387
+ },
+ {
+ "epoch": 4.515625,
+ "grad_norm": 0.001729196636006236,
+ "learning_rate": 1.0150825865472813e-06,
+ "loss": 0.0,
+ "step": 388
+ },
+ {
+ "epoch": 4.52734375,
+ "grad_norm": 0.002961450256407261,
+ "learning_rate": 9.997154869056588e-07,
+ "loss": 0.0,
+ "step": 389
+ },
+ {
+ "epoch": 4.5390625,
+ "grad_norm": 0.002972877351567149,
+ "learning_rate": 9.844364725834058e-07,
+ "loss": 0.0,
+ "step": 390
+ },
+ {
+ "epoch": 4.55078125,
+ "grad_norm": 0.0008791300351731479,
+ "learning_rate": 9.692464406482727e-07,
+ "loss": 0.0,
+ "step": 391
+ },
+ {
+ "epoch": 4.5625,
+ "grad_norm": 0.0018361720722168684,
+ "learning_rate": 9.541462829436426e-07,
+ "loss": 0.0,
+ "step": 392
+ },
+ {
+ "epoch": 4.57421875,
+ "grad_norm": 0.0029881680384278297,
+ "learning_rate": 9.39136886036166e-07,
+ "loss": 0.0,
+ "step": 393
+ },
+ {
+ "epoch": 4.5859375,
+ "grad_norm": 0.0030923946760594845,
+ "learning_rate": 9.24219131163705e-07,
+ "loss": 0.0,
+ "step": 394
+ },
+ {
+ "epoch": 4.59765625,
+ "grad_norm": 0.0014424376422539353,
+ "learning_rate": 9.093938941836012e-07,
+ "loss": 0.0,
+ "step": 395
+ },
+ {
+ "epoch": 4.609375,
+ "grad_norm": 0.0018437248654663563,
+ "learning_rate": 8.946620455212438e-07,
+ "loss": 0.0,
+ "step": 396
+ },
+ {
+ "epoch": 4.62109375,
+ "grad_norm": 0.0035209229681640863,
+ "learning_rate": 8.800244501189722e-07,
+ "loss": 0.0003,
+ "step": 397
+ },
+ {
+ "epoch": 4.6328125,
+ "grad_norm": 0.19659285247325897,
+ "learning_rate": 8.654819673852874e-07,
+ "loss": 0.0,
+ "step": 398
+ },
+ {
+ "epoch": 4.64453125,
+ "grad_norm": 0.17559310793876648,
+ "learning_rate": 8.510354511443975e-07,
+ "loss": 0.0003,
+ "step": 399
+ },
+ {
+ "epoch": 4.65625,
+ "grad_norm": 0.0017143903532996774,
+ "learning_rate": 8.366857495860869e-07,
+ "loss": 0.0,
+ "step": 400
+ },
+ {
+ "epoch": 4.66796875,
+ "grad_norm": 0.008345520123839378,
+ "learning_rate": 8.224337052159154e-07,
+ "loss": 0.0,
+ "step": 401
+ },
+ {
+ "epoch": 4.6796875,
+ "grad_norm": 0.001156082609668374,
+ "learning_rate": 8.082801548057553e-07,
+ "loss": 0.0,
+ "step": 402
+ },
+ {
+ "epoch": 4.69140625,
+ "grad_norm": 0.0014560276176780462,
+ "learning_rate": 7.942259293446594e-07,
+ "loss": 0.0,
+ "step": 403
+ },
+ {
+ "epoch": 4.703125,
+ "grad_norm": 0.0013030421687290072,
+ "learning_rate": 7.802718539900761e-07,
+ "loss": 0.0,
+ "step": 404
+ },
+ {
+ "epoch": 4.71484375,
+ "grad_norm": 0.0012356194201856852,
+ "learning_rate": 7.66418748019396e-07,
+ "loss": 0.0,
+ "step": 405
+ },
+ {
+ "epoch": 4.7265625,
+ "grad_norm": 0.004214293789118528,
+ "learning_rate": 7.526674247818569e-07,
+ "loss": 0.0,
+ "step": 406
+ },
+ {
+ "epoch": 4.73828125,
+ "grad_norm": 0.003626940306276083,
+ "learning_rate": 7.390186916507869e-07,
+ "loss": 0.0,
+ "step": 407
+ },
+ {
+ "epoch": 4.75,
+ "grad_norm": 0.003801507642492652,
+ "learning_rate": 7.254733499761993e-07,
+ "loss": 0.0,
+ "step": 408
+ },
+ {
+ "epoch": 4.76171875,
+ "grad_norm": 0.0023032291792333126,
+ "learning_rate": 7.120321950377487e-07,
+ "loss": 0.0,
+ "step": 409
+ },
+ {
+ "epoch": 4.7734375,
+ "grad_norm": 0.0018953473772853613,
+ "learning_rate": 6.986960159980327e-07,
+ "loss": 0.0,
+ "step": 410
+ },
+ {
+ "epoch": 4.78515625,
+ "grad_norm": 0.0011394222965463996,
+ "learning_rate": 6.854655958562625e-07,
+ "loss": 0.0,
+ "step": 411
+ },
+ {
+ "epoch": 4.796875,
+ "grad_norm": 0.0021377848461270332,
+ "learning_rate": 6.723417114022907e-07,
+ "loss": 0.0,
+ "step": 412
+ },
+ {
+ "epoch": 4.80859375,
+ "grad_norm": 0.0011264781933277845,
+ "learning_rate": 6.593251331709993e-07,
+ "loss": 0.0,
+ "step": 413
+ },
+ {
+ "epoch": 4.8203125,
+ "grad_norm": 0.004995762836188078,
+ "learning_rate": 6.464166253970672e-07,
+ "loss": 0.0,
+ "step": 414
+ },
+ {
+ "epoch": 4.83203125,
+ "grad_norm": 0.0014515212969854474,
+ "learning_rate": 6.336169459700933e-07,
+ "loss": 0.0,
+ "step": 415
+ },
+ {
+ "epoch": 4.84375,
+ "grad_norm": 0.000913277908693999,
+ "learning_rate": 6.209268463901047e-07,
+ "loss": 0.0,
+ "step": 416
+ },
+ {
+ "epoch": 4.85546875,
+ "grad_norm": 0.010075507685542107,
+ "learning_rate": 6.083470717234285e-07,
+ "loss": 0.0,
+ "step": 417
+ },
+ {
+ "epoch": 4.8671875,
+ "grad_norm": 0.0015437327092513442,
+ "learning_rate": 5.95878360558953e-07,
+ "loss": 0.0,
+ "step": 418
+ },
+ {
+ "epoch": 4.87890625,
+ "grad_norm": 0.0008694503339938819,
+ "learning_rate": 5.835214449647602e-07,
+ "loss": 0.0,
+ "step": 419
+ },
+ {
+ "epoch": 4.890625,
+ "grad_norm": 0.003764442168176174,
+ "learning_rate": 5.712770504451426e-07,
+ "loss": 0.0,
+ "step": 420
+ },
+ {
+ "epoch": 4.90234375,
+ "grad_norm": 0.0019374670227989554,
+ "learning_rate": 5.591458958980123e-07,
+ "loss": 0.0,
+ "step": 421
+ },
+ {
+ "epoch": 4.9140625,
+ "grad_norm": 0.00113675557076931,
+ "learning_rate": 5.471286935726866e-07,
+ "loss": 0.0,
+ "step": 422
+ },
+ {
+ "epoch": 4.92578125,
+ "grad_norm": 0.001957179745659232,
+ "learning_rate": 5.352261490280767e-07,
+ "loss": 0.0,
+ "step": 423
+ },
+ {
+ "epoch": 4.9375,
+ "grad_norm": 0.00735822319984436,
+ "learning_rate": 5.234389610912552e-07,
+ "loss": 0.0,
+ "step": 424
+ },
+ {
+ "epoch": 4.94921875,
+ "grad_norm": 0.0010691111674532294,
+ "learning_rate": 5.117678218164337e-07,
+ "loss": 0.0,
+ "step": 425
+ },
+ {
+ "epoch": 4.9609375,
+ "grad_norm": 0.15340374410152435,
+ "learning_rate": 5.002134164443262e-07,
+ "loss": 0.0003,
+ "step": 426
+ },
+ {
+ "epoch": 4.97265625,
+ "grad_norm": 0.0033372677862644196,
+ "learning_rate": 4.887764233619163e-07,
+ "loss": 0.0,
+ "step": 427
+ },
+ {
+ "epoch": 4.984375,
+ "grad_norm": 0.005183639004826546,
+ "learning_rate": 4.774575140626317e-07,
+ "loss": 0.0,
+ "step": 428
+ },
+ {
+ "epoch": 4.99609375,
+ "grad_norm": 0.003231282811611891,
+ "learning_rate": 4.6625735310691396e-07,
+ "loss": 0.0,
+ "step": 429
+ },
+ {
+ "epoch": 5.0,
+ "grad_norm": 0.003231282811611891,
+ "learning_rate": 4.55176598083206e-07,
+ "loss": 0.0,
+ "step": 430
+ },
+ {
+ "epoch": 5.01171875,
+ "grad_norm": 0.0033447262831032276,
+ "learning_rate": 4.4421589956933827e-07,
+ "loss": 0.0,
+ "step": 431
+ },
+ {
+ "epoch": 5.0234375,
+ "grad_norm": 0.0028587093111127615,
+ "learning_rate": 4.3337590109433505e-07,
+ "loss": 0.0,
+ "step": 432
+ },
+ {
+ "epoch": 5.03515625,
+ "grad_norm": 0.0016057805623859167,
+ "learning_rate": 4.22657239100632e-07,
+ "loss": 0.0,
+ "step": 433
+ },
+ {
+ "epoch": 5.046875,
+ "grad_norm": 0.0019842074252665043,
+ "learning_rate": 4.1206054290670537e-07,
+ "loss": 0.0,
+ "step": 434
+ },
+ {
+ "epoch": 5.05859375,
+ "grad_norm": 0.0024509418290108442,
+ "learning_rate": 4.015864346701251e-07,
+ "loss": 0.0,
+ "step": 435
+ },
+ {
+ "epoch": 5.0703125,
+ "grad_norm": 0.001955198124051094,
+ "learning_rate": 3.9123552935102976e-07,
+ "loss": 0.0,
+ "step": 436
+ },
+ {
+ "epoch": 5.08203125,
+ "grad_norm": 0.001240650424733758,
+ "learning_rate": 3.81008434676014e-07,
+ "loss": 0.0,
+ "step": 437
+ },
+ {
+ "epoch": 5.09375,
+ "grad_norm": 0.0014848706778138876,
+ "learning_rate": 3.709057511024541e-07,
+ "loss": 0.0,
+ "step": 438
+ },
+ {
+ "epoch": 5.10546875,
+ "grad_norm": 0.0021577742882072926,
+ "learning_rate": 3.609280717832489e-07,
+ "loss": 0.0,
+ "step": 439
+ },
+ {
+ "epoch": 5.1171875,
+ "grad_norm": 0.0019075453747063875,
+ "learning_rate": 3.510759825319976e-07,
+ "loss": 0.0,
+ "step": 440
+ },
+ {
+ "epoch": 5.12890625,
+ "grad_norm": 0.001239179284311831,
+ "learning_rate": 3.413500617886023e-07,
+ "loss": 0.0,
+ "step": 441
+ },
+ {
+ "epoch": 5.140625,
+ "grad_norm": 0.004561480600386858,
+ "learning_rate": 3.3175088058530925e-07,
+ "loss": 0.0,
+ "step": 442
+ },
+ {
+ "epoch": 5.15234375,
+ "grad_norm": 0.006389022804796696,
+ "learning_rate": 3.2227900251318055e-07,
+ "loss": 0.0,
+ "step": 443
+ },
+ {
+ "epoch": 5.1640625,
+ "grad_norm": 0.0017025156412273645,
+ "learning_rate": 3.1293498368900414e-07,
+ "loss": 0.0,
+ "step": 444
+ },
+ {
+ "epoch": 5.17578125,
+ "grad_norm": 0.001661973656155169,
+ "learning_rate": 3.0371937272264454e-07,
+ "loss": 0.0,
+ "step": 445
+ },
+ {
+ "epoch": 5.1875,
+ "grad_norm": 0.0022532425355166197,
+ "learning_rate": 2.9463271068482955e-07,
+ "loss": 0.0,
+ "step": 446
+ },
+ {
+ "epoch": 5.19921875,
+ "grad_norm": 0.07497455179691315,
+ "learning_rate": 2.856755310753867e-07,
+ "loss": 0.0001,
+ "step": 447
+ },
+ {
+ "epoch": 5.2109375,
+ "grad_norm": 0.002289533382281661,
+ "learning_rate": 2.7684835979191664e-07,
+ "loss": 0.0,
+ "step": 448
+ },
+ {
+ "epoch": 5.22265625,
+ "grad_norm": 0.008681479841470718,
+ "learning_rate": 2.681517150989185e-07,
+ "loss": 0.0,
+ "step": 449
+ },
+ {
+ "epoch": 5.234375,
+ "grad_norm": 0.0013883741339668632,
+ "learning_rate": 2.5958610759736133e-07,
+ "loss": 0.0,
+ "step": 450
+ },
+ {
+ "epoch": 5.24609375,
+ "grad_norm": 0.009817993268370628,
+ "learning_rate": 2.511520401947032e-07,
+ "loss": 0.0,
+ "step": 451
+ },
+ {
+ "epoch": 5.2578125,
+ "grad_norm": 0.003581307828426361,
+ "learning_rate": 2.428500080753676e-07,
+ "loss": 0.0,
+ "step": 452
+ },
+ {
+ "epoch": 5.26953125,
+ "grad_norm": 0.0025084957014769316,
+ "learning_rate": 2.3468049867166747e-07,
+ "loss": 0.0,
+ "step": 453
+ },
+ {
+ "epoch": 5.28125,
+ "grad_norm": 0.0008235024870373309,
+ "learning_rate": 2.2664399163518786e-07,
+ "loss": 0.0,
+ "step": 454
+ },
+ {
+ "epoch": 5.29296875,
+ "grad_norm": 0.003393619554117322,
+ "learning_rate": 2.1874095880862505e-07,
+ "loss": 0.0,
+ "step": 455
+ },
+ {
+ "epoch": 5.3046875,
+ "grad_norm": 0.005308760330080986,
+ "learning_rate": 2.1097186419808151e-07,
+ "loss": 0.0,
+ "step": 456
+ },
+ {
+ "epoch": 5.31640625,
+ "grad_norm": 0.0014810613356530666,
+ "learning_rate": 2.0333716394582536e-07,
+ "loss": 0.0,
+ "step": 457
+ },
+ {
+ "epoch": 5.328125,
+ "grad_norm": 0.0010313258972018957,
+ "learning_rate": 1.958373063035071e-07,
+ "loss": 0.0,
+ "step": 458
+ },
+ {
+ "epoch": 5.33984375,
+ "grad_norm": 0.0015522941248491406,
+ "learning_rate": 1.8847273160584378e-07,
+ "loss": 0.0,
+ "step": 459
+ },
+ {
+ "epoch": 5.3515625,
+ "grad_norm": 0.0010389795061200857,
+ "learning_rate": 1.8124387224476347e-07,
+ "loss": 0.0,
+ "step": 460
+ },
+ {
+ "epoch": 5.36328125,
+ "grad_norm": 0.0019286195747554302,
+ "learning_rate": 1.7415115264402065e-07,
+ "loss": 0.0,
+ "step": 461
+ },
+ {
+ "epoch": 5.375,
+ "grad_norm": 0.001823669415898621,
+ "learning_rate": 1.6719498923427697e-07,
+ "loss": 0.0,
+ "step": 462
+ },
+ {
+ "epoch": 5.38671875,
+ "grad_norm": 0.0009815149242058396,
+ "learning_rate": 1.6037579042864876e-07,
+ "loss": 0.0,
+ "step": 463
+ },
+ {
+ "epoch": 5.3984375,
+ "grad_norm": 0.002994813024997711,
+ "learning_rate": 1.5369395659873305e-07,
+ "loss": 0.0,
+ "step": 464
+ },
+ {
+ "epoch": 5.41015625,
+ "grad_norm": 0.0021288918796926737,
+ "learning_rate": 1.471498800510962e-07,
+ "loss": 0.0,
+ "step": 465
+ },
+ {
+ "epoch": 5.421875,
+ "grad_norm": 0.001066897064447403,
+ "learning_rate": 1.407439450042433e-07,
+ "loss": 0.0,
+ "step": 466
+ },
+ {
+ "epoch": 5.43359375,
+ "grad_norm": 0.012361166998744011,
+ "learning_rate": 1.3447652756605894e-07,
+ "loss": 0.0,
+ "step": 467
+ },
+ {
+ "epoch": 5.4453125,
+ "grad_norm": 0.001722171320579946,
+ "learning_rate": 1.283479957117248e-07,
+ "loss": 0.0,
+ "step": 468
+ },
+ {
+ "epoch": 5.45703125,
+ "grad_norm": 0.0033318859059363604,
+ "learning_rate": 1.223587092621162e-07,
+ "loss": 0.0,
+ "step": 469
+ },
+ {
+ "epoch": 5.46875,
+ "grad_norm": 0.03496450558304787,
+ "learning_rate": 1.1650901986267365e-07,
+ "loss": 0.0,
+ "step": 470
+ },
+ {
+ "epoch": 5.48046875,
+ "grad_norm": 0.009674963541328907,
+ "learning_rate": 1.1079927096275978e-07,
+ "loss": 0.0,
+ "step": 471
+ },
+ {
+ "epoch": 5.4921875,
+ "grad_norm": 0.004603398498147726,
+ "learning_rate": 1.052297977954922e-07,
+ "loss": 0.0,
+ "step": 472
+ },
+ {
+ "epoch": 5.50390625,
+ "grad_norm": 0.00150200049392879,
+ "learning_rate": 9.98009273580633e-08,
+ "loss": 0.0,
+ "step": 473
+ },
+ {
+ "epoch": 5.515625,
+ "grad_norm": 0.0012543356278911233,
+ "learning_rate": 9.451297839253915e-08,
+ "loss": 0.0,
+ "step": 474
+ },
+ {
+ "epoch": 5.52734375,
+ "grad_norm": 0.0024099883157759905,
+ "learning_rate": 8.936626136714754e-08,
+ "loss": 0.0,
+ "step": 475
+ },
+ {
+ "epoch": 5.5390625,
+ "grad_norm": 0.001273278845474124,
+ "learning_rate": 8.436107845804842e-08,
+ "loss": 0.0,
+ "step": 476
+ },
+ {
+ "epoch": 5.55078125,
+ "grad_norm": 0.0012492879759520292,
+ "learning_rate": 7.949772353159191e-08,
+ "loss": 0.0,
+ "step": 477
+ },
+ {
+ "epoch": 5.5625,
+ "grad_norm": 0.0016244460130110383,
+ "learning_rate": 7.477648212706746e-08,
+ "loss": 0.0,
+ "step": 478
+ },
+ {
+ "epoch": 5.57421875,
+ "grad_norm": 0.002714181086048484,
+ "learning_rate": 7.019763143993441e-08,
+ "loss": 0.0,
+ "step": 479
+ },
+ {
+ "epoch": 5.5859375,
+ "grad_norm": 0.0032653063535690308,
+ "learning_rate": 6.576144030555259e-08,
+ "loss": 0.0,
+ "step": 480
+ },
+ {
+ "epoch": 5.59765625,
+ "grad_norm": 0.0022408408112823963,
+ "learning_rate": 6.14681691833935e-08,
+ "loss": 0.0,
+ "step": 481
+ },
+ {
+ "epoch": 5.609375,
+ "grad_norm": 0.0037225966807454824,
+ "learning_rate": 5.731807014175195e-08,
+ "loss": 0.0,
+ "step": 482
+ },
+ {
+ "epoch": 5.62109375,
+ "grad_norm": 0.004080182407051325,
+ "learning_rate": 5.3311386842944125e-08,
+ "loss": 0.0,
+ "step": 483
+ },
+ {
+ "epoch": 5.6328125,
+ "grad_norm": 0.0035425201058387756,
+ "learning_rate": 4.944835452900199e-08,
+ "loss": 0.0,
+ "step": 484
+ },
+ {
+ "epoch": 5.64453125,
+ "grad_norm": 0.001153574208728969,
+ "learning_rate": 4.5729200007862686e-08,
+ "loss": 0.0,
+ "step": 485
+ },
+ {
+ "epoch": 5.65625,
+ "grad_norm": 0.002798657398670912,
+ "learning_rate": 4.215414164005116e-08,
+ "loss": 0.0,
+ "step": 486
+ },
+ {
+ "epoch": 5.66796875,
+ "grad_norm": 0.08384445309638977,
+ "learning_rate": 3.872338932585984e-08,
+ "loss": 0.0001,
+ "step": 487
+ },
+ {
+ "epoch": 5.6796875,
+ "grad_norm": 0.001083986135199666,
+ "learning_rate": 3.543714449302488e-08,
+ "loss": 0.0,
+ "step": 488
+ },
+ {
+ "epoch": 5.69140625,
+ "grad_norm": 0.0016904210206121206,
+ "learning_rate": 3.229560008490007e-08,
+ "loss": 0.0,
+ "step": 489
+ },
+ {
+ "epoch": 5.703125,
+ "grad_norm": 0.00799526646733284,
+ "learning_rate": 2.9298940549128962e-08,
+ "loss": 0.0,
+ "step": 490
+ },
+ {
+ "epoch": 5.71484375,
+ "grad_norm": 0.002683044411242008,
+ "learning_rate": 2.6447341826814077e-08,
+ "loss": 0.0,
+ "step": 491
+ },
+ {
+ "epoch": 5.7265625,
+ "grad_norm": 0.001099879969842732,
+ "learning_rate": 2.3740971342189056e-08,
+ "loss": 0.0,
+ "step": 492
+ },
+ {
+ "epoch": 5.73828125,
+ "grad_norm": 0.001016490743495524,
+ "learning_rate": 2.117998799278709e-08,
+ "loss": 0.0,
+ "step": 493
+ },
+ {
+ "epoch": 5.75,
+ "grad_norm": 0.0011416857596486807,
+ "learning_rate": 1.876454214011253e-08,
+ "loss": 0.0,
+ "step": 494
+ },
+ {
+ "epoch": 5.76171875,
+ "grad_norm": 0.001162792439572513,
+ "learning_rate": 1.6494775600812418e-08,
+ "loss": 0.0,
+ "step": 495
+ },
+ {
+ "epoch": 5.7734375,
+ "grad_norm": 0.008754150941967964,
+ "learning_rate": 1.4370821638350353e-08,
+ "loss": 0.0,
+ "step": 496
+ },
+ {
+ "epoch": 5.78515625,
+ "grad_norm": 0.003359902650117874,
+ "learning_rate": 1.2392804955181915e-08,
+ "loss": 0.0,
+ "step": 497
+ },
+ {
+ "epoch": 5.796875,
+ "grad_norm": 0.001140256063081324,
+ "learning_rate": 1.0560841685433864e-08,
+ "loss": 0.0,
+ "step": 498
+ },
+ {
+ "epoch": 5.80859375,
+ "grad_norm": 0.0024218405596911907,
+ "learning_rate": 8.875039388084317e-09,
+ "loss": 0.0,
+ "step": 499
+ },
+ {
+ "epoch": 5.8203125,
+ "grad_norm": 0.0034275399520993233,
+ "learning_rate": 7.335497040648898e-09,
+ "loss": 0.0,
+ "step": 500
+ },
+ {
+ "epoch": 5.83203125,
+ "grad_norm": 0.007808190770447254,
+ "learning_rate": 5.942305033369289e-09,
+ "loss": 0.0,
+ "step": 501
+ },
+ {
+ "epoch": 5.84375,
+ "grad_norm": 0.0012946473434567451,
+ "learning_rate": 4.695545163905524e-09,
+ "loss": 0.0,
+ "step": 502
+ },
+ {
+ "epoch": 5.85546875,
+ "grad_norm": 0.0012264677789062262,
+ "learning_rate": 3.5952906325339988e-09,
+ "loss": 0.0,
+ "step": 503
+ },
+ {
+ "epoch": 5.8671875,
+ "grad_norm": 0.0008321711211465299,
+ "learning_rate": 2.641606037850353e-09,
+ "loss": 0.0,
+ "step": 504
+ },
+ {
+ "epoch": 5.87890625,
+ "grad_norm": 0.002902488224208355,
+ "learning_rate": 1.834547372975004e-09,
+ "loss": 0.0,
+ "step": 505
+ },
+ {
+ "epoch": 5.890625,
+ "grad_norm": 0.0023215031251311302,
+ "learning_rate": 1.1741620222671667e-09,
+ "loss": 0.0,
+ "step": 506
+ },
+ {
+ "epoch": 5.90234375,
+ "grad_norm": 0.0010812516557052732,
+ "learning_rate": 6.604887585426323e-10,
+ "loss": 0.0,
+ "step": 507
+ },
+ {
+ "epoch": 5.9140625,
+ "grad_norm": 0.004617776721715927,
+ "learning_rate": 2.9355774079614653e-10,
+ "loss": 0.0,
+ "step": 508
+ },
+ {
+ "epoch": 5.92578125,
+ "grad_norm": 0.0021092123351991177,
+ "learning_rate": 7.339051243254735e-11,
+ "loss": 0.0,
+ "step": 509
+ },
+ {
+ "epoch": 5.9375,
+ "grad_norm": 0.000980311306193471,
+ "learning_rate": 0.0,
+ "loss": 0.0001,
+ "step": 510
+ }
+ ],
+ "logging_steps": 1,
+ "max_steps": 510,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 6,
+ "save_steps": 85,
+ "stateful_callbacks": {
+ "TrainerControl": {
+ "args": {
+ "should_epoch_stop": false,
+ "should_evaluate": false,
+ "should_log": false,
+ "should_save": true,
+ "should_training_stop": true
+ },
+ "attributes": {}
+ }
+ },
+ "total_flos": 1.2798585930252288e+18,
+ "train_batch_size": 4,
+ "trial_name": null,
+ "trial_params": null
+}
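The trainer-state log above ends with the learning rate decayed to 0.0 at step 510. It can be sanity-checked with a short script; a minimal sketch, assuming the JSON is saved as `trainer_state.json` and that the per-step entries live under the standard Hugging Face Trainer `log_history` key:

```python
import json

def summarize_trainer_state(path: str) -> dict:
    """Report the final logged step, learning rate, and loss from a Trainer state file."""
    with open(path) as f:
        state = json.load(f)
    # keep only per-step entries that actually logged a loss
    logs = [e for e in state["log_history"] if "loss" in e]
    last = logs[-1]
    return {
        "final_step": last["step"],
        "final_lr": last["learning_rate"],
        "final_loss": last["loss"],
        "max_steps": state["max_steps"],
    }
```

For the run above this would report `final_step` 510 and a final learning rate of 0.0, matching the cosine schedule reaching its end.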
diff --git a/checkpoint-510/training_args.bin b/checkpoint-510/training_args.bin
new file mode 100644
index 0000000000000000000000000000000000000000..31435c2b54979c306fa2a089f64bc8d21e1d21cf
--- /dev/null
+++ b/checkpoint-510/training_args.bin
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ae0e02a237d0ed5071f0d2c656d0cc6fa0293647ec7cffc6f8d299311f592cdc
+size 8056
diff --git a/checkpoint-510/zero_to_fp32.py b/checkpoint-510/zero_to_fp32.py
new file mode 100644
index 0000000000000000000000000000000000000000..24cc342e78d1a006c782b3a4cd68d9ce786d8fd8
--- /dev/null
+++ b/checkpoint-510/zero_to_fp32.py
@@ -0,0 +1,604 @@
+#!/usr/bin/env python
+
+# Copyright (c) Microsoft Corporation.
+# SPDX-License-Identifier: Apache-2.0
+
+# DeepSpeed Team
+
+# This script extracts fp32 consolidated weights from ZeRO stage 1, 2 and 3 DeepSpeed checkpoints. It gets
+# copied into the top level checkpoint dir, so the user can easily do the conversion at any point in
+# the future. Once extracted, the weights don't require DeepSpeed and can be used in any
+# application.
+#
+# example: python zero_to_fp32.py . pytorch_model.bin
+
+import argparse
+import torch
+import glob
+import math
+import os
+import re
+from collections import OrderedDict
+from dataclasses import dataclass
+
+# While this script doesn't use deepspeed to recover data, the checkpoints are pickled with
+# DeepSpeed data structures, so deepspeed must be installed in the current python environment.
+from deepspeed.utils import logger
+from deepspeed.checkpoint.constants import (DS_VERSION, OPTIMIZER_STATE_DICT, SINGLE_PARTITION_OF_FP32_GROUPS,
+ FP32_FLAT_GROUPS, ZERO_STAGE, PARTITION_COUNT, PARAM_SHAPES, BUFFER_NAMES,
+ FROZEN_PARAM_SHAPES, FROZEN_PARAM_FRAGMENTS)
+
+
+@dataclass
+class zero_model_state:
+ buffers: dict
+ param_shapes: dict
+ shared_params: list
+ ds_version: str
+ frozen_param_shapes: dict
+ frozen_param_fragments: dict
+
+
+debug = 0
+
+# load to cpu
+device = torch.device('cpu')
+
+
+def atoi(text):
+ return int(text) if text.isdigit() else text
+
+
+def natural_keys(text):
+ '''
+ alist.sort(key=natural_keys) sorts in human order
+ http://nedbatchelder.com/blog/200712/human_sorting.html
+ (See Toothy's implementation in the comments)
+ '''
+ return [atoi(c) for c in re.split(r'(\d+)', text)]
+
+
+def get_model_state_file(checkpoint_dir, zero_stage):
+ if not os.path.isdir(checkpoint_dir):
+ raise FileNotFoundError(f"Directory '{checkpoint_dir}' doesn't exist")
+
+ # there should be only one file
+ if zero_stage <= 2:
+ file = os.path.join(checkpoint_dir, "mp_rank_00_model_states.pt")
+ elif zero_stage == 3:
+ file = os.path.join(checkpoint_dir, "zero_pp_rank_0_mp_rank_00_model_states.pt")
+
+ if not os.path.exists(file):
+ raise FileNotFoundError(f"can't find model states file at '{file}'")
+
+ return file
+
+
+def get_checkpoint_files(checkpoint_dir, glob_pattern):
+ # XXX: need to test that this simple glob rule works for multi-node setup too
+ ckpt_files = sorted(glob.glob(os.path.join(checkpoint_dir, glob_pattern)), key=natural_keys)
+
+ if len(ckpt_files) == 0:
+ raise FileNotFoundError(f"can't find {glob_pattern} files in directory '{checkpoint_dir}'")
+
+ return ckpt_files
+
+
+def get_optim_files(checkpoint_dir):
+ return get_checkpoint_files(checkpoint_dir, "*_optim_states.pt")
+
+
+def get_model_state_files(checkpoint_dir):
+ return get_checkpoint_files(checkpoint_dir, "*_model_states.pt")
+
+
+def parse_model_states(files):
+ zero_model_states = []
+ for file in files:
+ state_dict = torch.load(file, map_location=device)
+
+ if BUFFER_NAMES not in state_dict:
+ raise ValueError(f"{file} is not a model state checkpoint")
+ buffer_names = state_dict[BUFFER_NAMES]
+ if debug:
+ print("Found buffers:", buffer_names)
+
+ # recover just the buffers while restoring them to fp32 if they were saved in fp16
+ buffers = {k: v.float() for k, v in state_dict["module"].items() if k in buffer_names}
+ param_shapes = state_dict[PARAM_SHAPES]
+
+ # collect parameters that are included in param_shapes
+ param_names = []
+ for s in param_shapes:
+ for name in s.keys():
+ param_names.append(name)
+
+ # update with frozen parameters
+ frozen_param_shapes = state_dict.get(FROZEN_PARAM_SHAPES, None)
+ if frozen_param_shapes is not None:
+ if debug:
+ print(f"Found frozen_param_shapes: {frozen_param_shapes}")
+ param_names += list(frozen_param_shapes.keys())
+
+ # handle shared params
+ shared_params = [[k, v] for k, v in state_dict["shared_params"].items()]
+
+ ds_version = state_dict.get(DS_VERSION, None)
+
+ frozen_param_fragments = state_dict.get(FROZEN_PARAM_FRAGMENTS, None)
+
+ z_model_state = zero_model_state(buffers=buffers,
+ param_shapes=param_shapes,
+ shared_params=shared_params,
+ ds_version=ds_version,
+ frozen_param_shapes=frozen_param_shapes,
+ frozen_param_fragments=frozen_param_fragments)
+ zero_model_states.append(z_model_state)
+
+ return zero_model_states
+
+
+def parse_optim_states(files, ds_checkpoint_dir):
+
+ total_files = len(files)
+ state_dicts = []
+ for f in files:
+ state_dict = torch.load(f, map_location=device)
+ # immediately discard the two potentially huge optimizer states, since we only care about the fp32 master weights,
+ # and also handle the case where they were already removed by another helper script
+ state_dict["optimizer_state_dict"].pop("optimizer_state_dict", None)
+ state_dicts.append(state_dict)
+
+ if ZERO_STAGE not in state_dicts[0][OPTIMIZER_STATE_DICT]:
+ raise ValueError(f"{files[0]} is not a zero checkpoint")
+ zero_stage = state_dicts[0][OPTIMIZER_STATE_DICT][ZERO_STAGE]
+ world_size = state_dicts[0][OPTIMIZER_STATE_DICT][PARTITION_COUNT]
+
+ # For ZeRO-2 each param group can have different partition_count as data parallelism for expert
+ # parameters can be different from data parallelism for non-expert parameters. So we can just
+ # use the max of the partition_count to get the dp world_size.
+
+ if type(world_size) is list:
+ world_size = max(world_size)
+
+ if world_size != total_files:
+ raise ValueError(
+ f"Expected {world_size} of '*_optim_states.pt' under '{ds_checkpoint_dir}' but found {total_files} files. "
+ "Possibly due to an overwrite of an old checkpoint, or a checkpoint didn't get saved by one or more processes."
+ )
+
+ # the groups are named differently in each stage
+ if zero_stage <= 2:
+ fp32_groups_key = SINGLE_PARTITION_OF_FP32_GROUPS
+ elif zero_stage == 3:
+ fp32_groups_key = FP32_FLAT_GROUPS
+ else:
+ raise ValueError(f"unknown zero stage {zero_stage}")
+
+ if zero_stage <= 2:
+ fp32_flat_groups = [state_dicts[i][OPTIMIZER_STATE_DICT][fp32_groups_key] for i in range(len(state_dicts))]
+ elif zero_stage == 3:
+ # if there is more than one param group, there will be multiple flattened tensors - one
+ # flattened tensor per group - for simplicity merge them into a single tensor
+ #
+ # XXX: could make the script more memory efficient for when there are multiple groups - it
+ # will require matching the sub-lists of param_shapes for each param group flattened tensor
+
+ fp32_flat_groups = [
+ torch.cat(state_dicts[i][OPTIMIZER_STATE_DICT][fp32_groups_key], 0) for i in range(len(state_dicts))
+ ]
+
+ return zero_stage, world_size, fp32_flat_groups
+
+
+def _get_fp32_state_dict_from_zero_checkpoint(ds_checkpoint_dir, exclude_frozen_parameters):
+ """
+ Returns fp32 state_dict reconstructed from ds checkpoint
+
+ Args:
+ - ``ds_checkpoint_dir``: path to the deepspeed checkpoint folder (where the optimizer files are)
+
+ """
+ print(f"Processing zero checkpoint '{ds_checkpoint_dir}'")
+
+ optim_files = get_optim_files(ds_checkpoint_dir)
+ zero_stage, world_size, fp32_flat_groups = parse_optim_states(optim_files, ds_checkpoint_dir)
+ print(f"Detected checkpoint of type zero stage {zero_stage}, world_size: {world_size}")
+
+ model_files = get_model_state_files(ds_checkpoint_dir)
+
+ zero_model_states = parse_model_states(model_files)
+ print(f'Parsing checkpoint created by deepspeed=={zero_model_states[0].ds_version}')
+
+ if zero_stage <= 2:
+ return _get_fp32_state_dict_from_zero2_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters)
+ elif zero_stage == 3:
+ return _get_fp32_state_dict_from_zero3_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters)
+
+
+def _zero2_merge_frozen_params(state_dict, zero_model_states):
+ if zero_model_states[0].frozen_param_shapes is None or len(zero_model_states[0].frozen_param_shapes) == 0:
+ return
+
+ frozen_param_shapes = zero_model_states[0].frozen_param_shapes
+ frozen_param_fragments = zero_model_states[0].frozen_param_fragments
+
+ if debug:
+ num_elem = sum(s.numel() for s in frozen_param_shapes.values())
+ print(f'rank 0: {FROZEN_PARAM_SHAPES}.numel = {num_elem}')
+
+ wanted_params = len(frozen_param_shapes)
+ wanted_numel = sum(s.numel() for s in frozen_param_shapes.values())
+ avail_numel = sum([p.numel() for p in frozen_param_fragments.values()])
+ print(f'Frozen params: Have {avail_numel} numels to process.')
+ print(f'Frozen params: Need {wanted_numel} numels in {wanted_params} params')
+
+ total_params = 0
+ total_numel = 0
+ for name, shape in frozen_param_shapes.items():
+ total_params += 1
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+
+ state_dict[name] = frozen_param_fragments[name]
+
+ if debug:
+ print(f"{name} full shape: {shape} unpartitioned numel {unpartitioned_numel} ")
+
+ print(f"Reconstructed Frozen fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _has_callable(obj, fn):
+ attr = getattr(obj, fn, None)
+ return callable(attr)
+
+
+def _zero2_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states):
+ param_shapes = zero_model_states[0].param_shapes
+
+ # Reconstruction protocol:
+ #
+ # XXX: document this
+
+ if debug:
+ for i in range(world_size):
+ for j in range(len(fp32_flat_groups[0])):
+ print(f"{FP32_FLAT_GROUPS}[{i}][{j}].shape={fp32_flat_groups[i][j].shape}")
+
+ # XXX: memory usage doubles here (zero2)
+ num_param_groups = len(fp32_flat_groups[0])
+ merged_single_partition_of_fp32_groups = []
+ for i in range(num_param_groups):
+ merged_partitions = [sd[i] for sd in fp32_flat_groups]
+ full_single_fp32_vector = torch.cat(merged_partitions, 0)
+ merged_single_partition_of_fp32_groups.append(full_single_fp32_vector)
+ avail_numel = sum(
+ [full_single_fp32_vector.numel() for full_single_fp32_vector in merged_single_partition_of_fp32_groups])
+
+ if debug:
+ wanted_params = sum([len(shapes) for shapes in param_shapes])
+ wanted_numel = sum([sum(shape.numel() for shape in shapes.values()) for shapes in param_shapes])
+ # not asserting if there is a mismatch due to possible padding
+ print(f"Have {avail_numel} numels to process.")
+ print(f"Need {wanted_numel} numels in {wanted_params} params.")
+
+ # params
+ # XXX: for huge models that can't fit into the host's RAM we will have to recode this to support
+ # out-of-core computing solution
+ total_numel = 0
+ total_params = 0
+ for shapes, full_single_fp32_vector in zip(param_shapes, merged_single_partition_of_fp32_groups):
+ offset = 0
+ avail_numel = full_single_fp32_vector.numel()
+ for name, shape in shapes.items():
+
+ unpartitioned_numel = shape.numel() if _has_callable(shape, 'numel') else math.prod(shape)
+ total_numel += unpartitioned_numel
+ total_params += 1
+
+ if debug:
+ print(f"{name} full shape: {shape} unpartitioned numel {unpartitioned_numel} ")
+ state_dict[name] = full_single_fp32_vector.narrow(0, offset, unpartitioned_numel).view(shape)
+ offset += unpartitioned_numel
+
+ # Z2 started to align to 2*world_size to improve nccl performance. Therefore both offset and
+ # avail_numel can differ by anywhere between 0..2*world_size. Due to two unrelated complex
+ # paddings performed in the code it's almost impossible to predict the exact numbers w/o the
+ # live optimizer object, so we are checking that the numbers are within the right range
+ align_to = 2 * world_size
+
+ def zero2_align(x):
+ return align_to * math.ceil(x / align_to)
+
+ if debug:
+ print(f"original offset={offset}, avail_numel={avail_numel}")
+
+ offset = zero2_align(offset)
+ avail_numel = zero2_align(avail_numel)
+
+ if debug:
+ print(f"aligned offset={offset}, avail_numel={avail_numel}")
+
+ # Sanity check
+ if offset != avail_numel:
+ raise ValueError(f"consumed {offset} numels out of {avail_numel} - something is wrong")
+
+ print(f"Reconstructed fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _get_fp32_state_dict_from_zero2_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters):
+ state_dict = OrderedDict()
+
+ # buffers
+ buffers = zero_model_states[0].buffers
+ state_dict.update(buffers)
+ if debug:
+ print(f"added {len(buffers)} buffers")
+
+ if not exclude_frozen_parameters:
+ _zero2_merge_frozen_params(state_dict, zero_model_states)
+
+ _zero2_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states)
+
+ # recover shared parameters
+ for pair in zero_model_states[0].shared_params:
+ if pair[1] in state_dict:
+ state_dict[pair[0]] = state_dict[pair[1]]
+
+ return state_dict
+
+
+def zero3_partitioned_param_info(unpartitioned_numel, world_size):
+ remainder = unpartitioned_numel % world_size
+ padding_numel = (world_size - remainder) if remainder else 0
+ partitioned_numel = math.ceil(unpartitioned_numel / world_size)
+ return partitioned_numel, padding_numel
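The partitioning arithmetic in `zero3_partitioned_param_info` above deserves a quick worked example. A standalone restatement of the same formula (illustrative only; not part of the script itself):

```python
import math

def partition_info(numel: int, world_size: int) -> tuple:
    # Mirrors zero3_partitioned_param_info: each rank stores
    # ceil(numel / world_size) elements; the last rank is zero-padded
    # so that all ranks hold equal-sized shards.
    remainder = numel % world_size
    padding = (world_size - remainder) if remainder else 0
    return math.ceil(numel / world_size), padding

# A 10-element parameter sharded across 4 ranks: each rank holds 3
# elements, and 2 of the 12 stored elements are padding.
```

This is why the zero3 merge path below narrows each concatenated tensor back to `unpartitioned_numel` before reshaping: the trailing padding must be dropped.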
+
+
+def _zero3_merge_frozen_params(state_dict, world_size, zero_model_states):
+ if zero_model_states[0].frozen_param_shapes is None or len(zero_model_states[0].frozen_param_shapes) == 0:
+ return
+
+ if debug:
+ for i in range(world_size):
+ num_elem = sum(s.numel() for s in zero_model_states[i].frozen_param_fragments.values())
+ print(f'rank {i}: {FROZEN_PARAM_SHAPES}.numel = {num_elem}')
+
+ frozen_param_shapes = zero_model_states[0].frozen_param_shapes
+ wanted_params = len(frozen_param_shapes)
+ wanted_numel = sum(s.numel() for s in frozen_param_shapes.values())
+ avail_numel = sum([p.numel() for p in zero_model_states[0].frozen_param_fragments.values()]) * world_size
+ print(f'Frozen params: Have {avail_numel} numels to process.')
+ print(f'Frozen params: Need {wanted_numel} numels in {wanted_params} params')
+
+ total_params = 0
+ total_numel = 0
+ for name, shape in zero_model_states[0].frozen_param_shapes.items():
+ total_params += 1
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+
+ param_frags = tuple(model_state.frozen_param_fragments[name] for model_state in zero_model_states)
+ state_dict[name] = torch.cat(param_frags, 0).narrow(0, 0, unpartitioned_numel).view(shape)
+
+ partitioned_numel, partitioned_padding_numel = zero3_partitioned_param_info(unpartitioned_numel, world_size)
+
+ if debug:
+ print(
+ f"Frozen params: {total_params} {name} full shape: {shape} partition0 numel={partitioned_numel} partitioned_padding_numel={partitioned_padding_numel}"
+ )
+
+ print(f"Reconstructed Frozen fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _zero3_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states):
+ param_shapes = zero_model_states[0].param_shapes
+ avail_numel = fp32_flat_groups[0].numel() * world_size
+ # Reconstruction protocol: For zero3 we need to zip the partitions together at boundary of each
+ # param, re-consolidating each param, while dealing with padding if any
+
+ # merge list of dicts, preserving order
+ param_shapes = {k: v for d in param_shapes for k, v in d.items()}
+
+ if debug:
+ for i in range(world_size):
+ print(f"{FP32_FLAT_GROUPS}[{i}].shape={fp32_flat_groups[i].shape}")
+
+ wanted_params = len(param_shapes)
+ wanted_numel = sum(shape.numel() for shape in param_shapes.values())
+ # not asserting if there is a mismatch due to possible padding
+ avail_numel = fp32_flat_groups[0].numel() * world_size
+ print(f"Trainable params: Have {avail_numel} numels to process.")
+ print(f"Trainable params: Need {wanted_numel} numels in {wanted_params} params.")
+
+ # params
+ # XXX: for huge models that can't fit into the host's RAM we will have to recode this to support
+ # out-of-core computing solution
+ offset = 0
+ total_numel = 0
+ total_params = 0
+ for name, shape in param_shapes.items():
+
+ unpartitioned_numel = shape.numel()
+ total_numel += unpartitioned_numel
+ total_params += 1
+
+ partitioned_numel, partitioned_padding_numel = zero3_partitioned_param_info(unpartitioned_numel, world_size)
+
+ if debug:
+ print(
+ f"Trainable params: {total_params} {name} full shape: {shape} partition0 numel={partitioned_numel} partitioned_padding_numel={partitioned_padding_numel}"
+ )
+
+ # XXX: memory usage doubles here
+ state_dict[name] = torch.cat(
+ tuple(fp32_flat_groups[i].narrow(0, offset, partitioned_numel) for i in range(world_size)),
+ 0).narrow(0, 0, unpartitioned_numel).view(shape)
+ offset += partitioned_numel
+
+ offset *= world_size
+
+ # Sanity check
+ if offset != avail_numel:
+ raise ValueError(f"consumed {offset} numels out of {avail_numel} - something is wrong")
+
+ print(f"Reconstructed Trainable fp32 state dict with {total_params} params {total_numel} elements")
+
+
+def _get_fp32_state_dict_from_zero3_checkpoint(world_size, fp32_flat_groups, zero_model_states,
+ exclude_frozen_parameters):
+ state_dict = OrderedDict()
+
+ # buffers
+ buffers = zero_model_states[0].buffers
+ state_dict.update(buffers)
+ if debug:
+ print(f"added {len(buffers)} buffers")
+
+ if not exclude_frozen_parameters:
+ _zero3_merge_frozen_params(state_dict, world_size, zero_model_states)
+
+ _zero3_merge_trainable_params(state_dict, world_size, fp32_flat_groups, zero_model_states)
+
+ # recover shared parameters
+ for pair in zero_model_states[0].shared_params:
+ if pair[1] in state_dict:
+ state_dict[pair[0]] = state_dict[pair[1]]
+
+ return state_dict
+
+
+def get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag=None, exclude_frozen_parameters=False):
+ """
+ Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated state_dict that can be loaded with
+ ``load_state_dict()`` and used for training without DeepSpeed or shared with others, for example
+ via a model hub.
+
+ Args:
+ - ``checkpoint_dir``: path to the desired checkpoint folder
+ - ``tag``: checkpoint tag used as a unique identifier for checkpoint. If not provided will attempt to load tag in 'latest' file. e.g., ``global_step14``
+ - ``exclude_frozen_parameters``: exclude frozen parameters
+
+ Returns:
+ - pytorch ``state_dict``
+
+ Note: this approach may not work if your application doesn't have sufficient free CPU memory and
+ you may need to use the offline approach using the ``zero_to_fp32.py`` script that is saved with
+ the checkpoint.
+
+ A typical usage might be ::
+
+ from deepspeed.utils.zero_to_fp32 import get_fp32_state_dict_from_zero_checkpoint
+ # do the training and checkpoint saving
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir) # already on cpu
+ model = model.cpu() # move to cpu
+ model.load_state_dict(state_dict)
+ # submit to model hub or save the model to share with others
+
+ In this example the ``model`` will no longer be usable in the deepspeed context of the same
+ application; i.e., you will need to re-initialize the deepspeed engine, since
+ ``model.load_state_dict(state_dict)`` will remove all the deepspeed magic from it.
+
+ If you want it all done for you, use ``load_state_dict_from_zero_checkpoint`` instead.
+
+ """
+ if tag is None:
+ latest_path = os.path.join(checkpoint_dir, 'latest')
+ if os.path.isfile(latest_path):
+ with open(latest_path, 'r') as fd:
+ tag = fd.read().strip()
+ else:
+ raise ValueError(f"Unable to find 'latest' file at {latest_path}")
+
+ ds_checkpoint_dir = os.path.join(checkpoint_dir, tag)
+
+ if not os.path.isdir(ds_checkpoint_dir):
+ raise FileNotFoundError(f"Directory '{ds_checkpoint_dir}' doesn't exist")
+
+ return _get_fp32_state_dict_from_zero_checkpoint(ds_checkpoint_dir, exclude_frozen_parameters)
+
+
+def convert_zero_checkpoint_to_fp32_state_dict(checkpoint_dir, output_file, tag=None, exclude_frozen_parameters=False):
+ """
+ Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated ``state_dict`` file that can be
+ loaded with ``torch.load(file)`` + ``load_state_dict()`` and used for training without DeepSpeed.
+
+ Args:
+ - ``checkpoint_dir``: path to the desired checkpoint folder. (one that contains the tag-folder, like ``global_step14``)
+ - ``output_file``: path to the pytorch fp32 state_dict output file (e.g. path/pytorch_model.bin)
+ - ``tag``: checkpoint tag used as a unique identifier for checkpoint. If not provided will attempt to load tag in the file named ``latest`` in the checkpoint folder, e.g., ``global_step14``
+ - ``exclude_frozen_parameters``: exclude frozen parameters
+ """
+
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag, exclude_frozen_parameters)
+ print(f"Saving fp32 state dict to {output_file}")
+ torch.save(state_dict, output_file)
+
+
+def load_state_dict_from_zero_checkpoint(model, checkpoint_dir, tag=None):
+ """
+ 1. Put the provided model to cpu
+ 2. Convert ZeRO 2 or 3 checkpoint into a single fp32 consolidated ``state_dict``
+ 3. Load it into the provided model
+
+ Args:
+ - ``model``: the model object to update
+ - ``checkpoint_dir``: path to the desired checkpoint folder. (one that contains the tag-folder, like ``global_step14``)
+ - ``tag``: checkpoint tag used as a unique identifier for checkpoint. If not provided will attempt to load tag in the file named ``latest`` in the checkpoint folder, e.g., ``global_step14``
+
+ Returns:
+ - ``model``: modified model
+
+ Make sure you have plenty of CPU memory available before you call this function. If you don't
+ have enough use the ``zero_to_fp32.py`` utility to do the conversion. You will find it
+ conveniently placed for you in the checkpoint folder.
+
+ A typical usage might be ::
+
+ from deepspeed.utils.zero_to_fp32 import load_state_dict_from_zero_checkpoint
+ model = load_state_dict_from_zero_checkpoint(trainer.model, checkpoint_dir)
+ # submit to model hub or save the model to share with others
+
+ Note that once this has run, the ``model`` will no longer be usable in the deepspeed context
+ of the same application; i.e., you will need to re-initialize the deepspeed engine, since
+ ``model.load_state_dict(state_dict)`` will remove all the deepspeed magic from it.
+
+ """
+ logger.info(f"Extracting fp32 weights")
+ state_dict = get_fp32_state_dict_from_zero_checkpoint(checkpoint_dir, tag)
+
+ logger.info(f"Overwriting model with fp32 weights")
+ model = model.cpu()
+ model.load_state_dict(state_dict, strict=False)
+
+ return model
+
+
+if __name__ == "__main__":
+
+ parser = argparse.ArgumentParser()
+ parser.add_argument("checkpoint_dir",
+ type=str,
+ help="path to the desired checkpoint folder, e.g., path/checkpoint-12")
+ parser.add_argument(
+ "output_file",
+ type=str,
+ help="path to the pytorch fp32 state_dict output file (e.g. path/checkpoint-12/pytorch_model.bin)")
+ parser.add_argument("-t",
+ "--tag",
+ type=str,
+ default=None,
+ help="checkpoint tag used as a unique identifier for checkpoint. e.g., global_step1")
+ parser.add_argument("--exclude_frozen_parameters", action='store_true', help="exclude frozen parameters")
+ parser.add_argument("-d", "--debug", action='store_true', help="enable debug")
+ args = parser.parse_args()
+
+ debug = args.debug
+
+ convert_zero_checkpoint_to_fp32_state_dict(args.checkpoint_dir,
+ args.output_file,
+ tag=args.tag,
+ exclude_frozen_parameters=args.exclude_frozen_parameters)