penfever committed
Commit · 4cf2836 · 0 Parent(s)
Reset repository without checkpoint dirs
- .gitattributes +36 -0
- README.md +61 -0
- added_tokens.json +28 -0
- all_results.json +16 -0
- chat_template.jinja +89 -0
- config.json +68 -0
- generation_config.json +12 -0
- merges.txt +0 -0
- model-00001-of-00004.safetensors +3 -0
- model-00002-of-00004.safetensors +3 -0
- model-00003-of-00004.safetensors +3 -0
- model-00004-of-00004.safetensors +3 -0
- model.safetensors.index.json +407 -0
- run_summary.json +12 -0
- special_tokens_map.json +31 -0
- tokenizer.json +3 -0
- tokenizer_config.json +240 -0
- train_results.json +16 -0
- trainer_log.jsonl +331 -0
- trainer_state.json +0 -0
- training_args.bin +3 -0
- training_loss.png +0 -0
- vocab.json +0 -0
.gitattributes
ADDED
@@ -0,0 +1,36 @@
+*.7z filter=lfs diff=lfs merge=lfs -text
+*.arrow filter=lfs diff=lfs merge=lfs -text
+*.bin filter=lfs diff=lfs merge=lfs -text
+*.bz2 filter=lfs diff=lfs merge=lfs -text
+*.ckpt filter=lfs diff=lfs merge=lfs -text
+*.ftz filter=lfs diff=lfs merge=lfs -text
+*.gz filter=lfs diff=lfs merge=lfs -text
+*.h5 filter=lfs diff=lfs merge=lfs -text
+*.joblib filter=lfs diff=lfs merge=lfs -text
+*.lfs.* filter=lfs diff=lfs merge=lfs -text
+*.mlmodel filter=lfs diff=lfs merge=lfs -text
+*.model filter=lfs diff=lfs merge=lfs -text
+*.msgpack filter=lfs diff=lfs merge=lfs -text
+*.npy filter=lfs diff=lfs merge=lfs -text
+*.npz filter=lfs diff=lfs merge=lfs -text
+*.onnx filter=lfs diff=lfs merge=lfs -text
+*.ot filter=lfs diff=lfs merge=lfs -text
+*.parquet filter=lfs diff=lfs merge=lfs -text
+*.pb filter=lfs diff=lfs merge=lfs -text
+*.pickle filter=lfs diff=lfs merge=lfs -text
+*.pkl filter=lfs diff=lfs merge=lfs -text
+*.pt filter=lfs diff=lfs merge=lfs -text
+*.pth filter=lfs diff=lfs merge=lfs -text
+*.rar filter=lfs diff=lfs merge=lfs -text
+*.safetensors filter=lfs diff=lfs merge=lfs -text
+saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+*.tar.* filter=lfs diff=lfs merge=lfs -text
+*.tar filter=lfs diff=lfs merge=lfs -text
+*.tflite filter=lfs diff=lfs merge=lfs -text
+*.tgz filter=lfs diff=lfs merge=lfs -text
+*.wasm filter=lfs diff=lfs merge=lfs -text
+*.xz filter=lfs diff=lfs merge=lfs -text
+*.zip filter=lfs diff=lfs merge=lfs -text
+*.zst filter=lfs diff=lfs merge=lfs -text
+*tfevents* filter=lfs diff=lfs merge=lfs -text
+tokenizer.json filter=lfs diff=lfs merge=lfs -text
README.md
ADDED
@@ -0,0 +1,61 @@
+---
+library_name: transformers
+license: apache-2.0
+base_model: Qwen/Qwen3-8B
+tags:
+- llama-factory
+- full
+- generated_from_trainer
+model-index:
+- name: claude-4-5-sonnet-thinking-stackexchange-overflow-32ep-32k-traces
+  results: []
+---
+
+<!-- This model card has been generated automatically according to the information the Trainer had access to. You
+should probably proofread and complete it, then remove this comment. -->
+
+# claude-4-5-sonnet-thinking-stackexchange-overflow-32ep-32k-traces
+
+This model is a fine-tuned version of [Qwen/Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B) on the DCAgent2/claude-4-5-sonnet-thinking-stackexchange-overflow-32ep-32k-traces dataset.
+
+## Model description
+
+More information needed
+
+## Intended uses & limitations
+
+More information needed
+
+## Training and evaluation data
+
+More information needed
+
+## Training procedure
+
+### Training hyperparameters
+
+The following hyperparameters were used during training:
+- learning_rate: 4e-05
+- train_batch_size: 1
+- eval_batch_size: 8
+- seed: 42
+- distributed_type: multi-GPU
+- num_devices: 8
+- gradient_accumulation_steps: 2
+- total_train_batch_size: 16
+- total_eval_batch_size: 64
+- optimizer: Use OptimizerNames.ADAMW_TORCH_FUSED with betas=(0.9,0.98) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
+- lr_scheduler_type: cosine
+- lr_scheduler_warmup_ratio: 0.1
+- num_epochs: 7.0
+
+### Training results
+
+
+
+### Framework versions
+
+- Transformers 4.56.1
+- Pytorch 2.9.1+cu128
+- Datasets 4.4.1
+- Tokenizers 0.22.1
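The model card stops short of a usage snippet; here is a minimal loading sketch, assuming the checkpoint is served from the repo id referenced in run_summary.json below (laion/claude-4-5-sonnet-thinking-stackexchange-overflow-32ep-32k-traces — swap in the real id if it differs):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id, taken from the training_parameters URL in run_summary.json.
repo_id = "laion/claude-4-5-sonnet-thinking-stackexchange-overflow-32ep-32k-traces"

tok = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches "dtype": "bfloat16" in config.json
    device_map="auto",
)
```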
added_tokens.json
ADDED
@@ -0,0 +1,28 @@
+{
+  "</think>": 151668,
+  "</tool_call>": 151658,
+  "</tool_response>": 151666,
+  "<think>": 151667,
+  "<tool_call>": 151657,
+  "<tool_response>": 151665,
+  "<|box_end|>": 151649,
+  "<|box_start|>": 151648,
+  "<|endoftext|>": 151643,
+  "<|file_sep|>": 151664,
+  "<|fim_middle|>": 151660,
+  "<|fim_pad|>": 151662,
+  "<|fim_prefix|>": 151659,
+  "<|fim_suffix|>": 151661,
+  "<|im_end|>": 151645,
+  "<|im_start|>": 151644,
+  "<|image_pad|>": 151655,
+  "<|object_ref_end|>": 151647,
+  "<|object_ref_start|>": 151646,
+  "<|quad_end|>": 151651,
+  "<|quad_start|>": 151650,
+  "<|repo_name|>": 151663,
+  "<|video_pad|>": 151656,
+  "<|vision_end|>": 151653,
+  "<|vision_pad|>": 151654,
+  "<|vision_start|>": 151652
+}
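A quick check that the reasoning and tool-call delimiters in added_tokens.json resolve to the expected ids — a sketch reusing `tok` from the loading example above:

```python
for token in ("<think>", "</think>", "<tool_call>", "</tool_call>"):
    print(token, tok.convert_tokens_to_ids(token))
# expected ids per added_tokens.json: 151667, 151668, 151657, 151658
```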
all_results.json
ADDED
@@ -0,0 +1,16 @@
+{
+  "achieved_tflops_per_gpu": 2.7067145335487743,
+  "achieved_tflops_per_gpu_theoretical": 88.08551863340236,
+  "epoch": 7.0,
+  "loss_nan_ranks": 0,
+  "loss_rank_avg": 0.17071497440338135,
+  "mfu_percent": 0.867536709470761,
+  "mfu_percent_theoretical": 28.232538023526395,
+  "total_flos": 1.3627384946606735e+18,
+  "train_loss": 0.19606392417490914,
+  "train_runtime": 62933.2387,
+  "train_samples_per_second": 0.419,
+  "train_steps_per_second": 0.026,
+  "valid_targets_mean": 16957.1,
+  "valid_targets_min": 7571
+}
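The throughput fields above are internally consistent: achieved TFLOP/s per GPU is total_flos divided by runtime and device count, and mfu_percent follows if one assumes a 312 TFLOP/s bf16 peak (A100-class; the commit does not state the hardware, so the peak figure is an assumption). A sanity-check sketch:

```python
total_flos = 1.3627384946606735e18  # from all_results.json
train_runtime_s = 62933.2387
num_gpus = 8                        # num_devices in the README
peak_tflops = 312.0                 # assumed A100-class bf16 peak

achieved = total_flos / train_runtime_s / num_gpus / 1e12
print(f"{achieved:.4f} TFLOP/s per GPU")           # ~2.7067, matches the log
print(f"MFU {achieved / peak_tflops * 100:.4f}%")  # ~0.8675, matches mfu_percent
```

The separate `*_theoretical` fields evidently use a different FLOP accounting; the logged values do not say which.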
chat_template.jinja
ADDED
@@ -0,0 +1,89 @@
+{%- if tools %}
+    {{- '<|im_start|>system\n' }}
+    {%- if messages[0].role == 'system' %}
+        {{- messages[0].content + '\n\n' }}
+    {%- endif %}
+    {{- "# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>" }}
+    {%- for tool in tools %}
+        {{- "\n" }}
+        {{- tool | tojson }}
+    {%- endfor %}
+    {{- "\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{\"name\": <function-name>, \"arguments\": <args-json-object>}\n</tool_call><|im_end|>\n" }}
+{%- else %}
+    {%- if messages[0].role == 'system' %}
+        {{- '<|im_start|>system\n' + messages[0].content + '<|im_end|>\n' }}
+    {%- endif %}
+{%- endif %}
+{%- set ns = namespace(multi_step_tool=true, last_query_index=messages|length - 1) %}
+{%- for message in messages[::-1] %}
+    {%- set index = (messages|length - 1) - loop.index0 %}
+    {%- if ns.multi_step_tool and message.role == "user" and message.content is string and not(message.content.startswith('<tool_response>') and message.content.endswith('</tool_response>')) %}
+        {%- set ns.multi_step_tool = false %}
+        {%- set ns.last_query_index = index %}
+    {%- endif %}
+{%- endfor %}
+{%- for message in messages %}
+    {%- if message.content is string %}
+        {%- set content = message.content %}
+    {%- else %}
+        {%- set content = '' %}
+    {%- endif %}
+    {%- if (message.role == "user") or (message.role == "system" and not loop.first) %}
+        {{- '<|im_start|>' + message.role + '\n' + content + '<|im_end|>' + '\n' }}
+    {%- elif message.role == "assistant" %}
+        {%- set reasoning_content = '' %}
+        {%- if message.reasoning_content is string %}
+            {%- set reasoning_content = message.reasoning_content %}
+        {%- else %}
+            {%- if '</think>' in content %}
+                {%- set reasoning_content = content.split('</think>')[0].rstrip('\n').split('<think>')[-1].lstrip('\n') %}
+                {%- set content = content.split('</think>')[-1].lstrip('\n') %}
+            {%- endif %}
+        {%- endif %}
+        {%- if loop.index0 > ns.last_query_index %}
+            {%- if loop.last or (not loop.last and reasoning_content) %}
+                {{- '<|im_start|>' + message.role + '\n<think>\n' + reasoning_content.strip('\n') + '\n</think>\n\n' + content.lstrip('\n') }}
+            {%- else %}
+                {{- '<|im_start|>' + message.role + '\n' + content }}
+            {%- endif %}
+        {%- else %}
+            {{- '<|im_start|>' + message.role + '\n' + content }}
+        {%- endif %}
+        {%- if message.tool_calls %}
+            {%- for tool_call in message.tool_calls %}
+                {%- if (loop.first and content) or (not loop.first) %}
+                    {{- '\n' }}
+                {%- endif %}
+                {%- if tool_call.function %}
+                    {%- set tool_call = tool_call.function %}
+                {%- endif %}
+                {{- '<tool_call>\n{"name": "' }}
+                {{- tool_call.name }}
+                {{- '", "arguments": ' }}
+                {%- if tool_call.arguments is string %}
+                    {{- tool_call.arguments }}
+                {%- else %}
+                    {{- tool_call.arguments | tojson }}
+                {%- endif %}
+                {{- '}\n</tool_call>' }}
+            {%- endfor %}
+        {%- endif %}
+        {{- '<|im_end|>\n' }}
+    {%- elif message.role == "tool" %}
+        {%- if loop.first or (messages[loop.index0 - 1].role != "tool") %}
+            {{- '<|im_start|>user' }}
+        {%- endif %}
+        {{- '\n<tool_response>\n' }}
+        {{- content }}
+        {{- '\n</tool_response>' }}
+        {%- if loop.last or (messages[loop.index0 + 1].role != "tool") %}
+            {{- '<|im_end|>\n' }}
+        {%- endif %}
+    {%- endif %}
+{%- endfor %}
+{%- if add_generation_prompt %}
+    {{- '<|im_start|>assistant\n' }}
+    {%- if enable_thinking is defined and enable_thinking is false %}
+        {{- '<think>\n\n</think>\n\n' }}
+    {%- endif %}
+{%- endif %}
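This is Qwen3's standard ChatML-style template, with `<think>` handling for reasoning traces. A sketch of rendering it through the tokenizer, including the `enable_thinking` switch from the template's final block (continuing the earlier loading example):

```python
messages = [{"role": "user", "content": "Explain git-lfs pointer files."}]

# Default: the prompt ends at '<|im_start|>assistant\n', so the model may open a <think> block.
prompt = tok.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# enable_thinking=False makes the template emit an empty <think>\n\n</think> preamble instead.
prompt_nothink = tok.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True, enable_thinking=False
)
```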
config.json
ADDED
@@ -0,0 +1,68 @@
+{
+  "architectures": [
+    "Qwen3ForCausalLM"
+  ],
+  "attention_bias": false,
+  "attention_dropout": 0.0,
+  "dtype": "bfloat16",
+  "eos_token_id": 151645,
+  "head_dim": 128,
+  "hidden_act": "silu",
+  "hidden_size": 4096,
+  "initializer_range": 0.02,
+  "intermediate_size": 12288,
+  "layer_types": [
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention",
+    "full_attention"
+  ],
+  "max_position_embeddings": 40960,
+  "max_window_layers": 36,
+  "model_type": "qwen3",
+  "num_attention_heads": 32,
+  "num_hidden_layers": 36,
+  "num_key_value_heads": 8,
+  "pad_token_id": 151643,
+  "rms_norm_eps": 1e-06,
+  "rope_scaling": null,
+  "rope_theta": 1000000,
+  "sliding_window": null,
+  "tie_word_embeddings": false,
+  "transformers_version": "4.56.1",
+  "use_cache": false,
+  "use_sliding_window": false,
+  "vocab_size": 151936
+}
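A back-of-the-envelope parameter count from this config, as a sketch: it reproduces the byte total in model.safetensors.index.json below (16,381,470,720 bytes at 2 bytes per bfloat16 weight), though note the index's own total_parameters field records something else.

```python
vocab, hidden, inter, layers = 151936, 4096, 12288, 36
heads, kv_heads, head_dim = 32, 8, 128

attn = hidden * heads * head_dim           # q_proj
attn += 2 * hidden * kv_heads * head_dim   # k_proj, v_proj
attn += heads * head_dim * hidden          # o_proj
attn += 2 * head_dim                       # q_norm, k_norm
mlp = 2 * hidden * inter + inter * hidden  # gate, up, down
per_layer = attn + mlp + 2 * hidden        # + two RMSNorm weights

total = layers * per_layer + 2 * vocab * hidden + hidden  # embed, untied lm_head, final norm
print(total)      # 8190735360
print(total * 2)  # 16381470720 bytes, matching the shard index metadata
```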
generation_config.json
ADDED
@@ -0,0 +1,12 @@
+{
+  "do_sample": true,
+  "eos_token_id": [
+    151645,
+    151643
+  ],
+  "pad_token_id": 151643,
+  "temperature": 0.6,
+  "top_k": 20,
+  "top_p": 0.95,
+  "transformers_version": "4.56.1"
+}
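These sampling defaults (temperature 0.6, top-p 0.95, top-k 20 — Qwen3's recommended thinking-mode settings) are applied automatically by `generate`; a sketch continuing the earlier examples:

```python
inputs = tok(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    # do_sample / temperature / top_k / top_p come from generation_config.json.
    out = model.generate(**inputs, max_new_tokens=512)
print(tok.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```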
merges.txt
ADDED
The diff for this file is too large to render. See raw diff.
model-00001-of-00004.safetensors
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:dbe2dd2c0c3e8724a125902f63566a6b34902b398b760cee714c477dfac9f7b1
+size 4902257696
model-00002-of-00004.safetensors
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:98fe6a3f205d5fffbd8e9ac76fac267e14c383756a01df514ea2d0bf51de1b56
+size 4915960368
model-00003-of-00004.safetensors
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d25cdd0ea812470ae614866cc750a857d37dc90c276e2031b6500ba87ed54363
+size 4983068496
model-00004-of-00004.safetensors
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d7e4b79ffbee6cb6b7a6e6ac3fb9346f56f86095ff43899afdc74d8eb0eab0cd
+size 1580230264
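Each shard above is committed as a three-line Git LFS pointer rather than the binary itself; a small parsing sketch (plain string handling, no LFS client assumed):

```python
def parse_lfs_pointer(text: str) -> dict:
    # Split each "key value" line of a git-lfs pointer file.
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    fields["size"] = int(fields["size"])
    return fields

pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:d7e4b79ffbee6cb6b7a6e6ac3fb9346f56f86095ff43899afdc74d8eb0eab0cd\n"
    "size 1580230264\n"
)
print(parse_lfs_pointer(pointer)["size"])  # 1580230264
```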
model.safetensors.index.json
ADDED
@@ -0,0 +1,407 @@
+{
+  "metadata": {
+    "total_parameters": 308224,
+    "total_size": 16381470720
+  },
+  "weight_map": {
+    "lm_head.weight": "model-00004-of-00004.safetensors",
+    "model.embed_tokens.weight": "model-00001-of-00004.safetensors",
+    "model.layers.0.input_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.0.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.0.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.0.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.0.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.1.input_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.1.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.1.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.1.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.1.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.10.input_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.10.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.10.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.10.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.10.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.10.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.10.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.10.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.10.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.10.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.10.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.11.input_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.11.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.11.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.11.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.11.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.11.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.11.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.11.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.11.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.11.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.11.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.12.input_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.12.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.12.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.12.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.12.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.12.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.12.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.12.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.13.input_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.13.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.13.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.13.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.13.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.13.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.13.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.13.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.13.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.13.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.14.input_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.14.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.14.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.14.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.14.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.14.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.14.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.15.input_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.15.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.15.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.15.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.15.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.15.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.15.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.15.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.15.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.15.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.16.input_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.16.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.16.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.16.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.16.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.16.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.16.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.16.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.16.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.16.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.17.input_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.17.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.17.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.17.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.17.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.17.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.17.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.17.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.17.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.18.input_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.18.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.18.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.18.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.18.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.18.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.18.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.18.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.19.input_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.19.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.19.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.19.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.19.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.19.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.19.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.19.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.19.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.2.input_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.2.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.2.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.2.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.2.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.20.input_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.20.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.20.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.20.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.20.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.20.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.21.input_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.21.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.21.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.21.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.21.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.22.input_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.22.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.22.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.22.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.22.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.22.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.22.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.23.input_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.23.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.23.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.23.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.23.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.23.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.23.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.23.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.23.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.23.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.23.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.24.input_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.24.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.24.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.24.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.24.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.24.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.24.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.24.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.24.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.24.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.24.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.25.input_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.25.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.25.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.25.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.25.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.25.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.25.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.25.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.25.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.25.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.25.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.26.input_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.26.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.26.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.26.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.26.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.26.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.26.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.26.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.26.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.26.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.26.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.27.input_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.27.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.27.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.27.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.27.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.27.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.27.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.27.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.27.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.27.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.27.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.28.input_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.28.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.28.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.28.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.28.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.28.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.28.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.28.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.28.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.28.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.28.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.29.input_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.29.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.29.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.29.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.29.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.29.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.29.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.29.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.29.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.29.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.29.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.3.input_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.3.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.3.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.3.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.3.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.30.input_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.30.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.30.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.30.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.30.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.30.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.30.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.30.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.30.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.30.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.30.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.31.input_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.31.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.31.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.31.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.31.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.31.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.31.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.31.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.31.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.31.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.31.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.32.input_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.32.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.32.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.32.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.32.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.32.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.32.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.32.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.32.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.32.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.32.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.33.input_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.33.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.33.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.33.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.33.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.33.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.33.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.33.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.33.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.33.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.33.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.34.input_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.34.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.34.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.34.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.34.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.34.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.34.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.34.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.34.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
+    "model.layers.34.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.34.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.35.input_layernorm.weight": "model-00004-of-00004.safetensors",
+    "model.layers.35.mlp.down_proj.weight": "model-00004-of-00004.safetensors",
+    "model.layers.35.mlp.gate_proj.weight": "model-00004-of-00004.safetensors",
+    "model.layers.35.mlp.up_proj.weight": "model-00004-of-00004.safetensors",
+    "model.layers.35.post_attention_layernorm.weight": "model-00004-of-00004.safetensors",
+    "model.layers.35.self_attn.k_norm.weight": "model-00004-of-00004.safetensors",
+    "model.layers.35.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.35.self_attn.o_proj.weight": "model-00004-of-00004.safetensors",
+    "model.layers.35.self_attn.q_norm.weight": "model-00004-of-00004.safetensors",
+    "model.layers.35.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.35.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
+    "model.layers.4.input_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.4.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.4.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.4.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.4.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.4.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.5.input_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.5.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.5.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.5.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.5.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.5.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.5.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.6.input_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.6.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.6.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.6.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.6.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.6.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.7.input_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.7.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.7.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.7.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.7.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.7.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.7.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.7.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.8.input_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.8.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.8.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.8.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.8.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.8.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.8.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.8.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.8.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.8.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.9.input_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.9.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.9.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.9.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
+    "model.layers.9.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
+    "model.layers.9.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.9.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
+    "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+    "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+    "model.norm.weight": "model-00004-of-00004.safetensors"
+  }
+}
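The index maps every tensor to its shard so a loader can open only the files it needs; a lookup sketch against the file in this commit:

```python
import json
from collections import Counter

with open("model.safetensors.index.json") as f:
    index = json.load(f)

print(index["weight_map"]["lm_head.weight"])  # model-00004-of-00004.safetensors
# How the 36 layers (plus embeddings and head) are spread across the four shards:
print(Counter(index["weight_map"].values()))
```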
run_summary.json
ADDED
@@ -0,0 +1,12 @@
+{
+  "agent_name": null,
+  "training_start": null,
+  "training_end": null,
+  "created_by": "DCAgent",
+  "base_model_name": "Qwen/Qwen3-8B",
+  "dataset_name": "DCAgent2/claude-4-5-sonnet-thinking-stackexchange-overflow-32ep-32k-traces",
+  "training_type": "SFT",
+  "training_parameters": "https://huggingface.co/laion/claude-4-5-sonnet-thinking-stackexchange-overflow-32ep-32k-traces/blob/main/config.json",
+  "wandb_link": "https://wandb.ai/dogml/dc-agent/runs/claude-4-5-sonnet-thinking-stackexchange-overflow-32ep-32k-traces_Qwen3-8B",
+  "traces_location_s3": null
+}
special_tokens_map.json
ADDED
@@ -0,0 +1,31 @@
+{
+  "additional_special_tokens": [
+    "<|im_start|>",
+    "<|im_end|>",
+    "<|object_ref_start|>",
+    "<|object_ref_end|>",
+    "<|box_start|>",
+    "<|box_end|>",
+    "<|quad_start|>",
+    "<|quad_end|>",
+    "<|vision_start|>",
+    "<|vision_end|>",
+    "<|vision_pad|>",
+    "<|image_pad|>",
+    "<|video_pad|>"
+  ],
+  "eos_token": {
+    "content": "<|im_end|>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  },
+  "pad_token": {
+    "content": "<|endoftext|>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  }
+}
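Note: these strings resolve to fixed IDs in the vocabulary; the added_tokens_decoder table in tokenizer_config.json below records the mapping (for example, <|im_end|> is token 151645). A quick sanity check, assuming transformers is installed and "path/to/this/repo" is a placeholder for a local clone of this repository:

# Sketch: the special tokens declared here map to fixed vocabulary IDs.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("path/to/this/repo")
assert tok.convert_tokens_to_ids("<|im_start|>") == 151644
assert tok.convert_tokens_to_ids("<|im_end|>") == 151645
assert tok.eos_token == "<|im_end|>"      # from "eos_token" above
assert tok.pad_token == "<|endoftext|>"   # from "pad_token" above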
tokenizer.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:aeb13307a71acd8fe81861d94ad54ab689df773318809eed3cbe794b4492dae4
+size 11422654
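Note: this is a Git LFS pointer, not the tokenizer itself; the real ~11 MB file replaces it after `git lfs pull`. A sketch for verifying a fetched copy against the pointer's oid and size:

# Sketch: verify a Git-LFS-managed file against its pointer metadata.
# Assumes tokenizer.json has already been fetched (e.g. via `git lfs pull`).
import hashlib
import os

EXPECTED_OID = "aeb13307a71acd8fe81861d94ad54ab689df773318809eed3cbe794b4492dae4"
EXPECTED_SIZE = 11422654

assert os.path.getsize("tokenizer.json") == EXPECTED_SIZE

h = hashlib.sha256()
with open("tokenizer.json", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        h.update(chunk)
assert h.hexdigest() == EXPECTED_OID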
tokenizer_config.json
ADDED
@@ -0,0 +1,240 @@
+{
+  "add_bos_token": false,
+  "add_prefix_space": false,
+  "added_tokens_decoder": {
+    "151643": {
+      "content": "<|endoftext|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151644": {
+      "content": "<|im_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151645": {
+      "content": "<|im_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151646": {
+      "content": "<|object_ref_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151647": {
+      "content": "<|object_ref_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151648": {
+      "content": "<|box_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151649": {
+      "content": "<|box_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151650": {
+      "content": "<|quad_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151651": {
+      "content": "<|quad_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151652": {
+      "content": "<|vision_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151653": {
+      "content": "<|vision_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151654": {
+      "content": "<|vision_pad|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151655": {
+      "content": "<|image_pad|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151656": {
+      "content": "<|video_pad|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151657": {
+      "content": "<tool_call>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151658": {
+      "content": "</tool_call>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151659": {
+      "content": "<|fim_prefix|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151660": {
+      "content": "<|fim_middle|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151661": {
+      "content": "<|fim_suffix|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151662": {
+      "content": "<|fim_pad|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151663": {
+      "content": "<|repo_name|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151664": {
+      "content": "<|file_sep|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151665": {
+      "content": "<tool_response>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151666": {
+      "content": "</tool_response>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151667": {
+      "content": "<think>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151668": {
+      "content": "</think>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    }
+  },
+  "additional_special_tokens": [
+    "<|im_start|>",
+    "<|im_end|>",
+    "<|object_ref_start|>",
+    "<|object_ref_end|>",
+    "<|box_start|>",
+    "<|box_end|>",
+    "<|quad_start|>",
+    "<|quad_end|>",
+    "<|vision_start|>",
+    "<|vision_end|>",
+    "<|vision_pad|>",
+    "<|image_pad|>",
+    "<|video_pad|>"
+  ],
+  "bos_token": null,
+  "clean_up_tokenization_spaces": false,
+  "eos_token": "<|im_end|>",
+  "errors": "replace",
+  "extra_special_tokens": {},
+  "model_max_length": 32768,
+  "pad_token": "<|endoftext|>",
+  "padding_side": "right",
+  "split_special_tokens": false,
+  "tokenizer_class": "Qwen2Tokenizer",
+  "unk_token": null
+}
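Note: a sketch of how this config surfaces once the tokenizer is loaded (assumes transformers is installed; "path/to/this/repo" is a placeholder for a local clone of this repository):

# Sketch: fields from tokenizer_config.json as seen through transformers.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("path/to/this/repo")
print(tok.model_max_length)  # 32768
print(tok.padding_side)      # "right"

# The chat template shipped with the model wraps each turn in
# <|im_start|> ... <|im_end|>, which is why <|im_end|> serves as EOS.
messages = [{"role": "user", "content": "hello"}]
prompt = tok.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)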
train_results.json
ADDED
@@ -0,0 +1,16 @@
+{
+    "achieved_tflops_per_gpu": 2.7067145335487743,
+    "achieved_tflops_per_gpu_theoretical": 88.08551863340236,
+    "epoch": 7.0,
+    "loss_nan_ranks": 0,
+    "loss_rank_avg": 0.17071497440338135,
+    "mfu_percent": 0.867536709470761,
+    "mfu_percent_theoretical": 28.232538023526395,
+    "total_flos": 1.3627384946606735e+18,
+    "train_loss": 0.19606392417490914,
+    "train_runtime": 62933.2387,
+    "train_samples_per_second": 0.419,
+    "train_steps_per_second": 0.026,
+    "valid_targets_mean": 16957.1,
+    "valid_targets_min": 7571
+}
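Note: the throughput figures are internally consistent. total_flos over train_runtime gives the aggregate FLOP rate, the ratio to the per-GPU number implies eight GPUs, and both MFU percentages back out a 312 TFLOP/s peak (the BF16 peak of an A100; an inference from the ratios, not something the file states). A quick re-derivation:

# Sketch: re-deriving the MFU numbers above from total_flos and runtime.
total_flos = 1.3627384946606735e18
runtime_s = 62933.2387

tflops_aggregate = total_flos / runtime_s / 1e12              # ~21.65 TFLOP/s across the job
n_gpus = tflops_aggregate / 2.7067145335487743                # ~8.0 -> an 8-GPU run
peak_tflops = 2.7067145335487743 / (0.867536709470761 / 100)  # ~312 TFLOP/s per GPU (A100 BF16?)

print(round(n_gpus), round(peak_tflops))                      # 8 312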
trainer_log.jsonl
ADDED
|
@@ -0,0 +1,331 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
{"current_steps": 5, "total_steps": 1652, "loss": 0.5153, "lr": 9.638554216867472e-07, "epoch": 0.021231422505307854, "percentage": 0.3, "elapsed_time": "0:03:18", "remaining_time": "18:09:11"}
|
| 2 |
+
{"current_steps": 10, "total_steps": 1652, "loss": 0.4921, "lr": 2.168674698795181e-06, "epoch": 0.04246284501061571, "percentage": 0.61, "elapsed_time": "0:06:27", "remaining_time": "17:39:08"}
|
| 3 |
+
{"current_steps": 15, "total_steps": 1652, "loss": 0.4234, "lr": 3.3734939759036146e-06, "epoch": 0.06369426751592357, "percentage": 0.91, "elapsed_time": "0:09:36", "remaining_time": "17:29:09"}
|
| 4 |
+
{"current_steps": 20, "total_steps": 1652, "loss": 0.4198, "lr": 4.578313253012049e-06, "epoch": 0.08492569002123142, "percentage": 1.21, "elapsed_time": "0:12:46", "remaining_time": "17:22:24"}
|
| 5 |
+
{"current_steps": 25, "total_steps": 1652, "loss": 0.4121, "lr": 5.783132530120482e-06, "epoch": 0.10615711252653928, "percentage": 1.51, "elapsed_time": "0:15:55", "remaining_time": "17:16:10"}
|
| 6 |
+
{"current_steps": 30, "total_steps": 1652, "loss": 0.3847, "lr": 6.987951807228917e-06, "epoch": 0.12738853503184713, "percentage": 1.82, "elapsed_time": "0:19:03", "remaining_time": "17:10:25"}
|
| 7 |
+
{"current_steps": 35, "total_steps": 1652, "loss": 0.3696, "lr": 8.19277108433735e-06, "epoch": 0.14861995753715498, "percentage": 2.12, "elapsed_time": "0:22:11", "remaining_time": "17:05:36"}
|
| 8 |
+
{"current_steps": 40, "total_steps": 1652, "loss": 0.3606, "lr": 9.397590361445785e-06, "epoch": 0.16985138004246284, "percentage": 2.42, "elapsed_time": "0:25:20", "remaining_time": "17:01:23"}
|
| 9 |
+
{"current_steps": 45, "total_steps": 1652, "loss": 0.3188, "lr": 1.0602409638554219e-05, "epoch": 0.1910828025477707, "percentage": 2.72, "elapsed_time": "0:28:29", "remaining_time": "16:57:21"}
|
| 10 |
+
{"current_steps": 50, "total_steps": 1652, "loss": 0.3358, "lr": 1.1807228915662651e-05, "epoch": 0.21231422505307856, "percentage": 3.03, "elapsed_time": "0:31:37", "remaining_time": "16:53:08"}
|
| 11 |
+
{"current_steps": 55, "total_steps": 1652, "loss": 0.3134, "lr": 1.3012048192771085e-05, "epoch": 0.23354564755838642, "percentage": 3.33, "elapsed_time": "0:34:46", "remaining_time": "16:49:46"}
|
| 12 |
+
{"current_steps": 60, "total_steps": 1652, "loss": 0.2825, "lr": 1.4216867469879519e-05, "epoch": 0.25477707006369427, "percentage": 3.63, "elapsed_time": "0:37:55", "remaining_time": "16:46:25"}
|
| 13 |
+
{"current_steps": 65, "total_steps": 1652, "loss": 0.269, "lr": 1.5421686746987955e-05, "epoch": 0.2760084925690021, "percentage": 3.93, "elapsed_time": "0:41:05", "remaining_time": "16:43:26"}
|
| 14 |
+
{"current_steps": 70, "total_steps": 1652, "loss": 0.2876, "lr": 1.6626506024096387e-05, "epoch": 0.29723991507430997, "percentage": 4.24, "elapsed_time": "0:44:15", "remaining_time": "16:40:09"}
|
| 15 |
+
{"current_steps": 75, "total_steps": 1652, "loss": 0.3045, "lr": 1.783132530120482e-05, "epoch": 0.3184713375796178, "percentage": 4.54, "elapsed_time": "0:47:24", "remaining_time": "16:36:47"}
|
| 16 |
+
{"current_steps": 80, "total_steps": 1652, "loss": 0.2576, "lr": 1.9036144578313255e-05, "epoch": 0.33970276008492567, "percentage": 4.84, "elapsed_time": "0:50:34", "remaining_time": "16:33:39"}
|
| 17 |
+
{"current_steps": 85, "total_steps": 1652, "loss": 0.2864, "lr": 2.0240963855421687e-05, "epoch": 0.3609341825902335, "percentage": 5.15, "elapsed_time": "0:53:43", "remaining_time": "16:30:26"}
|
| 18 |
+
{"current_steps": 90, "total_steps": 1652, "loss": 0.2537, "lr": 2.1445783132530123e-05, "epoch": 0.3821656050955414, "percentage": 5.45, "elapsed_time": "0:56:53", "remaining_time": "16:27:19"}
|
| 19 |
+
{"current_steps": 95, "total_steps": 1652, "loss": 0.2792, "lr": 2.265060240963856e-05, "epoch": 0.4033970276008493, "percentage": 5.75, "elapsed_time": "1:00:01", "remaining_time": "16:23:51"}
|
| 20 |
+
{"current_steps": 100, "total_steps": 1652, "loss": 0.2496, "lr": 2.3855421686746988e-05, "epoch": 0.42462845010615713, "percentage": 6.05, "elapsed_time": "1:03:10", "remaining_time": "16:20:34"}
|
| 21 |
+
{"current_steps": 105, "total_steps": 1652, "loss": 0.2674, "lr": 2.5060240963855423e-05, "epoch": 0.445859872611465, "percentage": 6.36, "elapsed_time": "1:06:20", "remaining_time": "16:17:19"}
|
| 22 |
+
{"current_steps": 110, "total_steps": 1652, "loss": 0.2563, "lr": 2.6265060240963856e-05, "epoch": 0.46709129511677283, "percentage": 6.66, "elapsed_time": "1:09:28", "remaining_time": "16:14:00"}
|
| 23 |
+
{"current_steps": 115, "total_steps": 1652, "loss": 0.2934, "lr": 2.746987951807229e-05, "epoch": 0.4883227176220807, "percentage": 6.96, "elapsed_time": "1:12:36", "remaining_time": "16:10:28"}
|
| 24 |
+
{"current_steps": 120, "total_steps": 1652, "loss": 0.2538, "lr": 2.8674698795180727e-05, "epoch": 0.5095541401273885, "percentage": 7.26, "elapsed_time": "1:15:45", "remaining_time": "16:07:14"}
|
| 25 |
+
{"current_steps": 125, "total_steps": 1652, "loss": 0.2489, "lr": 2.9879518072289156e-05, "epoch": 0.5307855626326964, "percentage": 7.57, "elapsed_time": "1:18:55", "remaining_time": "16:04:07"}
|
| 26 |
+
{"current_steps": 130, "total_steps": 1652, "loss": 0.256, "lr": 3.108433734939759e-05, "epoch": 0.5520169851380042, "percentage": 7.87, "elapsed_time": "1:22:04", "remaining_time": "16:00:55"}
|
| 27 |
+
{"current_steps": 135, "total_steps": 1652, "loss": 0.2449, "lr": 3.228915662650603e-05, "epoch": 0.5732484076433121, "percentage": 8.17, "elapsed_time": "1:25:13", "remaining_time": "15:57:43"}
|
| 28 |
+
{"current_steps": 140, "total_steps": 1652, "loss": 0.2345, "lr": 3.3493975903614457e-05, "epoch": 0.5944798301486199, "percentage": 8.47, "elapsed_time": "1:28:22", "remaining_time": "15:54:26"}
|
| 29 |
+
{"current_steps": 145, "total_steps": 1652, "loss": 0.2754, "lr": 3.4698795180722896e-05, "epoch": 0.6157112526539278, "percentage": 8.78, "elapsed_time": "1:31:30", "remaining_time": "15:51:01"}
|
| 30 |
+
{"current_steps": 150, "total_steps": 1652, "loss": 0.2422, "lr": 3.590361445783133e-05, "epoch": 0.6369426751592356, "percentage": 9.08, "elapsed_time": "1:34:38", "remaining_time": "15:47:45"}
|
| 31 |
+
{"current_steps": 155, "total_steps": 1652, "loss": 0.2485, "lr": 3.710843373493976e-05, "epoch": 0.6581740976645435, "percentage": 9.38, "elapsed_time": "1:37:48", "remaining_time": "15:44:34"}
|
| 32 |
+
{"current_steps": 160, "total_steps": 1652, "loss": 0.2405, "lr": 3.83132530120482e-05, "epoch": 0.6794055201698513, "percentage": 9.69, "elapsed_time": "1:40:57", "remaining_time": "15:41:22"}
|
| 33 |
+
{"current_steps": 165, "total_steps": 1652, "loss": 0.28, "lr": 3.9518072289156625e-05, "epoch": 0.7006369426751592, "percentage": 9.99, "elapsed_time": "1:44:06", "remaining_time": "15:38:11"}
|
| 34 |
+
{"current_steps": 170, "total_steps": 1652, "loss": 0.2259, "lr": 3.9999597743398453e-05, "epoch": 0.721868365180467, "percentage": 10.29, "elapsed_time": "1:47:15", "remaining_time": "15:35:04"}
|
| 35 |
+
{"current_steps": 175, "total_steps": 1652, "loss": 0.2501, "lr": 3.999713956720898e-05, "epoch": 0.7430997876857749, "percentage": 10.59, "elapsed_time": "1:50:24", "remaining_time": "15:31:51"}
|
| 36 |
+
{"current_steps": 180, "total_steps": 1652, "loss": 0.2561, "lr": 3.9992446965056756e-05, "epoch": 0.7643312101910829, "percentage": 10.9, "elapsed_time": "1:53:33", "remaining_time": "15:28:40"}
|
| 37 |
+
{"current_steps": 185, "total_steps": 1652, "loss": 0.2604, "lr": 3.998552046128038e-05, "epoch": 0.7855626326963907, "percentage": 11.2, "elapsed_time": "1:56:43", "remaining_time": "15:25:32"}
|
| 38 |
+
{"current_steps": 190, "total_steps": 1652, "loss": 0.2231, "lr": 3.997636082982853e-05, "epoch": 0.8067940552016986, "percentage": 11.5, "elapsed_time": "1:59:52", "remaining_time": "15:22:25"}
|
| 39 |
+
{"current_steps": 195, "total_steps": 1652, "loss": 0.2456, "lr": 3.9964969094173506e-05, "epoch": 0.8280254777070064, "percentage": 11.8, "elapsed_time": "2:03:01", "remaining_time": "15:19:15"}
|
| 40 |
+
{"current_steps": 200, "total_steps": 1652, "loss": 0.2378, "lr": 3.995134652719684e-05, "epoch": 0.8492569002123143, "percentage": 12.11, "elapsed_time": "2:06:10", "remaining_time": "15:16:04"}
|
| 41 |
+
{"current_steps": 205, "total_steps": 1652, "loss": 0.212, "lr": 3.993549465104712e-05, "epoch": 0.8704883227176221, "percentage": 12.41, "elapsed_time": "2:10:32", "remaining_time": "15:21:25"}
|
| 42 |
+
{"current_steps": 210, "total_steps": 1652, "loss": 0.2266, "lr": 3.991741523696984e-05, "epoch": 0.89171974522293, "percentage": 12.71, "elapsed_time": "2:13:41", "remaining_time": "15:17:59"}
|
| 43 |
+
{"current_steps": 215, "total_steps": 1652, "loss": 0.2398, "lr": 3.989711030510954e-05, "epoch": 0.9129511677282378, "percentage": 13.01, "elapsed_time": "2:16:50", "remaining_time": "15:14:33"}
|
| 44 |
+
{"current_steps": 220, "total_steps": 1652, "loss": 0.2243, "lr": 3.987458212428406e-05, "epoch": 0.9341825902335457, "percentage": 13.32, "elapsed_time": "2:19:59", "remaining_time": "15:11:11"}
|
| 45 |
+
{"current_steps": 225, "total_steps": 1652, "loss": 0.2403, "lr": 3.984983321173101e-05, "epoch": 0.9554140127388535, "percentage": 13.62, "elapsed_time": "2:23:07", "remaining_time": "15:07:45"}
|
| 46 |
+
{"current_steps": 230, "total_steps": 1652, "loss": 0.2246, "lr": 3.9822866332826555e-05, "epoch": 0.9766454352441614, "percentage": 13.92, "elapsed_time": "2:26:16", "remaining_time": "15:04:20"}
|
| 47 |
+
{"current_steps": 235, "total_steps": 1652, "loss": 0.2331, "lr": 3.9793684500776356e-05, "epoch": 0.9978768577494692, "percentage": 14.23, "elapsed_time": "2:29:24", "remaining_time": "15:00:56"}
|
| 48 |
+
{"current_steps": 240, "total_steps": 1652, "loss": 0.2037, "lr": 3.976229097627892e-05, "epoch": 1.0169851380042463, "percentage": 14.53, "elapsed_time": "2:32:15", "remaining_time": "14:55:48"}
|
| 49 |
+
{"current_steps": 245, "total_steps": 1652, "loss": 0.1971, "lr": 3.972868926716127e-05, "epoch": 1.0382165605095541, "percentage": 14.83, "elapsed_time": "2:35:25", "remaining_time": "14:52:34"}
|
| 50 |
+
{"current_steps": 250, "total_steps": 1652, "loss": 0.2454, "lr": 3.969288312798693e-05, "epoch": 1.059447983014862, "percentage": 15.13, "elapsed_time": "2:38:34", "remaining_time": "14:49:16"}
|
| 51 |
+
{"current_steps": 255, "total_steps": 1652, "loss": 0.2257, "lr": 3.965487655963647e-05, "epoch": 1.0806794055201698, "percentage": 15.44, "elapsed_time": "2:41:43", "remaining_time": "14:45:58"}
|
| 52 |
+
{"current_steps": 260, "total_steps": 1652, "loss": 0.2189, "lr": 3.961467380886042e-05, "epoch": 1.1019108280254777, "percentage": 15.74, "elapsed_time": "2:44:51", "remaining_time": "14:42:40"}
|
| 53 |
+
{"current_steps": 265, "total_steps": 1652, "loss": 0.2266, "lr": 3.957227936780476e-05, "epoch": 1.1231422505307855, "percentage": 16.04, "elapsed_time": "2:48:02", "remaining_time": "14:39:30"}
|
| 54 |
+
{"current_steps": 270, "total_steps": 1652, "loss": 0.2161, "lr": 3.952769797350899e-05, "epoch": 1.1443736730360934, "percentage": 16.34, "elapsed_time": "2:51:11", "remaining_time": "14:36:12"}
|
| 55 |
+
{"current_steps": 275, "total_steps": 1652, "loss": 0.1904, "lr": 3.948093460737679e-05, "epoch": 1.1656050955414012, "percentage": 16.65, "elapsed_time": "2:54:20", "remaining_time": "14:32:57"}
|
| 56 |
+
{"current_steps": 280, "total_steps": 1652, "loss": 0.2426, "lr": 3.943199449461944e-05, "epoch": 1.186836518046709, "percentage": 16.95, "elapsed_time": "2:57:28", "remaining_time": "14:29:37"}
|
| 57 |
+
{"current_steps": 285, "total_steps": 1652, "loss": 0.2234, "lr": 3.938088310367199e-05, "epoch": 1.208067940552017, "percentage": 17.25, "elapsed_time": "3:00:35", "remaining_time": "14:26:12"}
|
| 58 |
+
{"current_steps": 290, "total_steps": 1652, "loss": 0.2209, "lr": 3.932760614558218e-05, "epoch": 1.2292993630573248, "percentage": 17.55, "elapsed_time": "3:03:44", "remaining_time": "14:22:57"}
|
| 59 |
+
{"current_steps": 295, "total_steps": 1652, "loss": 0.23, "lr": 3.9272169573372345e-05, "epoch": 1.2505307855626326, "percentage": 17.86, "elapsed_time": "3:06:54", "remaining_time": "14:19:45"}
|
| 60 |
+
{"current_steps": 300, "total_steps": 1652, "loss": 0.2649, "lr": 3.921457958137421e-05, "epoch": 1.2717622080679405, "percentage": 18.16, "elapsed_time": "3:10:02", "remaining_time": "14:16:28"}
|
| 61 |
+
{"current_steps": 305, "total_steps": 1652, "loss": 0.238, "lr": 3.915484260453679e-05, "epoch": 1.2929936305732483, "percentage": 18.46, "elapsed_time": "3:13:11", "remaining_time": "14:13:14"}
|
| 62 |
+
{"current_steps": 310, "total_steps": 1652, "loss": 0.2319, "lr": 3.909296531770732e-05, "epoch": 1.3142250530785562, "percentage": 18.77, "elapsed_time": "3:16:21", "remaining_time": "14:10:00"}
|
| 63 |
+
{"current_steps": 315, "total_steps": 1652, "loss": 0.2209, "lr": 3.902895463488547e-05, "epoch": 1.335456475583864, "percentage": 19.07, "elapsed_time": "3:19:29", "remaining_time": "14:06:42"}
|
| 64 |
+
{"current_steps": 320, "total_steps": 1652, "loss": 0.2109, "lr": 3.896281770845076e-05, "epoch": 1.356687898089172, "percentage": 19.37, "elapsed_time": "3:22:37", "remaining_time": "14:03:25"}
|
| 65 |
+
{"current_steps": 325, "total_steps": 1652, "loss": 0.2003, "lr": 3.8894561928363396e-05, "epoch": 1.3779193205944797, "percentage": 19.67, "elapsed_time": "3:25:46", "remaining_time": "14:00:10"}
|
| 66 |
+
{"current_steps": 330, "total_steps": 1652, "loss": 0.2091, "lr": 3.8824194921338516e-05, "epoch": 1.3991507430997876, "percentage": 19.98, "elapsed_time": "3:28:54", "remaining_time": "13:56:54"}
|
| 67 |
+
{"current_steps": 335, "total_steps": 1652, "loss": 0.2081, "lr": 3.875172454999402e-05, "epoch": 1.4203821656050954, "percentage": 20.28, "elapsed_time": "3:32:03", "remaining_time": "13:53:38"}
|
| 68 |
+
{"current_steps": 340, "total_steps": 1652, "loss": 0.2073, "lr": 3.8677158911972e-05, "epoch": 1.4416135881104033, "percentage": 20.58, "elapsed_time": "3:35:12", "remaining_time": "13:50:26"}
|
| 69 |
+
{"current_steps": 345, "total_steps": 1652, "loss": 0.2098, "lr": 3.860050633903395e-05, "epoch": 1.4628450106157111, "percentage": 20.88, "elapsed_time": "3:38:21", "remaining_time": "13:47:13"}
|
| 70 |
+
{"current_steps": 350, "total_steps": 1652, "loss": 0.2233, "lr": 3.8521775396129824e-05, "epoch": 1.484076433121019, "percentage": 21.19, "elapsed_time": "3:41:30", "remaining_time": "13:43:59"}
|
| 71 |
+
{"current_steps": 355, "total_steps": 1652, "loss": 0.2165, "lr": 3.8440974880440925e-05, "epoch": 1.5053078556263269, "percentage": 21.49, "elapsed_time": "3:44:39", "remaining_time": "13:40:47"}
|
| 72 |
+
{"current_steps": 360, "total_steps": 1652, "loss": 0.2098, "lr": 3.835811382039703e-05, "epoch": 1.5265392781316347, "percentage": 21.79, "elapsed_time": "3:47:48", "remaining_time": "13:37:34"}
|
| 73 |
+
{"current_steps": 365, "total_steps": 1652, "loss": 0.2201, "lr": 3.827320147466752e-05, "epoch": 1.5477707006369426, "percentage": 22.09, "elapsed_time": "3:50:57", "remaining_time": "13:34:21"}
|
| 74 |
+
{"current_steps": 370, "total_steps": 1652, "loss": 0.2437, "lr": 3.818624733112687e-05, "epoch": 1.5690021231422504, "percentage": 22.4, "elapsed_time": "3:54:05", "remaining_time": "13:31:05"}
|
| 75 |
+
{"current_steps": 375, "total_steps": 1652, "loss": 0.1932, "lr": 3.809726110579446e-05, "epoch": 1.5902335456475583, "percentage": 22.7, "elapsed_time": "3:57:14", "remaining_time": "13:27:52"}
|
| 76 |
+
{"current_steps": 380, "total_steps": 1652, "loss": 0.227, "lr": 3.8006252741748986e-05, "epoch": 1.611464968152866, "percentage": 23.0, "elapsed_time": "4:00:22", "remaining_time": "13:24:36"}
|
| 77 |
+
{"current_steps": 385, "total_steps": 1652, "loss": 0.2166, "lr": 3.79132324080174e-05, "epoch": 1.632696390658174, "percentage": 23.31, "elapsed_time": "4:03:30", "remaining_time": "13:21:22"}
|
| 78 |
+
{"current_steps": 390, "total_steps": 1652, "loss": 0.1943, "lr": 3.781821049843869e-05, "epoch": 1.6539278131634818, "percentage": 23.61, "elapsed_time": "4:06:39", "remaining_time": "13:18:10"}
|
| 79 |
+
{"current_steps": 395, "total_steps": 1652, "loss": 0.2147, "lr": 3.7721197630502485e-05, "epoch": 1.6751592356687897, "percentage": 23.91, "elapsed_time": "4:09:48", "remaining_time": "13:14:58"}
|
| 80 |
+
{"current_steps": 400, "total_steps": 1652, "loss": 0.2095, "lr": 3.762220464416266e-05, "epoch": 1.6963906581740975, "percentage": 24.21, "elapsed_time": "4:12:58", "remaining_time": "13:11:47"}
|
| 81 |
+
{"current_steps": 405, "total_steps": 1652, "loss": 0.1993, "lr": 3.7521242600626154e-05, "epoch": 1.7176220806794054, "percentage": 24.52, "elapsed_time": "4:17:20", "remaining_time": "13:12:20"}
|
| 82 |
+
{"current_steps": 410, "total_steps": 1652, "loss": 0.2471, "lr": 3.7418322781117e-05, "epoch": 1.7388535031847132, "percentage": 24.82, "elapsed_time": "4:20:28", "remaining_time": "13:09:03"}
|
| 83 |
+
{"current_steps": 415, "total_steps": 1652, "loss": 0.2065, "lr": 3.731345668561577e-05, "epoch": 1.7600849256900213, "percentage": 25.12, "elapsed_time": "4:23:38", "remaining_time": "13:05:49"}
|
| 84 |
+
{"current_steps": 420, "total_steps": 1652, "loss": 0.1736, "lr": 3.720665603157464e-05, "epoch": 1.7813163481953291, "percentage": 25.42, "elapsed_time": "4:26:47", "remaining_time": "13:02:36"}
|
| 85 |
+
{"current_steps": 425, "total_steps": 1652, "loss": 0.219, "lr": 3.7097932752608096e-05, "epoch": 1.802547770700637, "percentage": 25.73, "elapsed_time": "4:29:56", "remaining_time": "12:59:20"}
|
| 86 |
+
{"current_steps": 430, "total_steps": 1652, "loss": 0.1986, "lr": 3.698729899715947e-05, "epoch": 1.8237791932059448, "percentage": 26.03, "elapsed_time": "4:33:06", "remaining_time": "12:56:07"}
|
| 87 |
+
{"current_steps": 435, "total_steps": 1652, "loss": 0.2078, "lr": 3.687476712714358e-05, "epoch": 1.8450106157112527, "percentage": 26.33, "elapsed_time": "4:36:15", "remaining_time": "12:52:53"}
|
| 88 |
+
{"current_steps": 440, "total_steps": 1652, "loss": 0.1934, "lr": 3.676034971656537e-05, "epoch": 1.8662420382165605, "percentage": 26.63, "elapsed_time": "4:39:24", "remaining_time": "12:49:38"}
|
| 89 |
+
{"current_steps": 445, "total_steps": 1652, "loss": 0.1917, "lr": 3.664405955011498e-05, "epoch": 1.8874734607218684, "percentage": 26.94, "elapsed_time": "4:42:33", "remaining_time": "12:46:22"}
|
| 90 |
+
{"current_steps": 450, "total_steps": 1652, "loss": 0.2353, "lr": 3.652590962173917e-05, "epoch": 1.9087048832271762, "percentage": 27.24, "elapsed_time": "4:45:41", "remaining_time": "12:43:05"}
|
| 91 |
+
{"current_steps": 455, "total_steps": 1652, "loss": 0.2091, "lr": 3.640591313318944e-05, "epoch": 1.929936305732484, "percentage": 27.54, "elapsed_time": "4:48:49", "remaining_time": "12:39:50"}
|
| 92 |
+
{"current_steps": 460, "total_steps": 1652, "loss": 0.202, "lr": 3.628408349254693e-05, "epoch": 1.951167728237792, "percentage": 27.85, "elapsed_time": "4:51:58", "remaining_time": "12:36:35"}
|
| 93 |
+
{"current_steps": 465, "total_steps": 1652, "loss": 0.2048, "lr": 3.616043431272417e-05, "epoch": 1.9723991507430998, "percentage": 28.15, "elapsed_time": "4:55:06", "remaining_time": "12:33:19"}
|
| 94 |
+
{"current_steps": 470, "total_steps": 1652, "loss": 0.2125, "lr": 3.603497940994407e-05, "epoch": 1.9936305732484076, "percentage": 28.45, "elapsed_time": "4:58:15", "remaining_time": "12:30:05"}
|
| 95 |
+
{"current_steps": 475, "total_steps": 1652, "loss": 0.1976, "lr": 3.59077328021961e-05, "epoch": 2.0127388535031847, "percentage": 28.75, "elapsed_time": "5:01:06", "remaining_time": "12:26:05"}
|
| 96 |
+
{"current_steps": 480, "total_steps": 1652, "loss": 0.2029, "lr": 3.577870870766997e-05, "epoch": 2.0339702760084926, "percentage": 29.06, "elapsed_time": "5:04:14", "remaining_time": "12:22:51"}
|
| 97 |
+
{"current_steps": 485, "total_steps": 1652, "loss": 0.1978, "lr": 3.5647921543166923e-05, "epoch": 2.0552016985138004, "percentage": 29.36, "elapsed_time": "5:07:23", "remaining_time": "12:19:39"}
|
| 98 |
+
{"current_steps": 490, "total_steps": 1652, "loss": 0.1965, "lr": 3.5515385922488846e-05, "epoch": 2.0764331210191083, "percentage": 29.66, "elapsed_time": "5:10:33", "remaining_time": "12:16:28"}
|
| 99 |
+
{"current_steps": 495, "total_steps": 1652, "loss": 0.1878, "lr": 3.5381116654805375e-05, "epoch": 2.097664543524416, "percentage": 29.96, "elapsed_time": "5:13:43", "remaining_time": "12:13:16"}
|
| 100 |
+
{"current_steps": 500, "total_steps": 1652, "loss": 0.1913, "lr": 3.524512874299912e-05, "epoch": 2.118895966029724, "percentage": 30.27, "elapsed_time": "5:16:52", "remaining_time": "12:10:05"}
|
| 101 |
+
{"current_steps": 505, "total_steps": 1652, "loss": 0.1988, "lr": 3.5107437381989325e-05, "epoch": 2.140127388535032, "percentage": 30.57, "elapsed_time": "5:20:01", "remaining_time": "12:06:52"}
|
| 102 |
+
{"current_steps": 510, "total_steps": 1652, "loss": 0.194, "lr": 3.4968057957034e-05, "epoch": 2.1613588110403397, "percentage": 30.87, "elapsed_time": "5:23:10", "remaining_time": "12:03:39"}
|
| 103 |
+
{"current_steps": 515, "total_steps": 1652, "loss": 0.1947, "lr": 3.482700604201086e-05, "epoch": 2.1825902335456475, "percentage": 31.17, "elapsed_time": "5:26:19", "remaining_time": "12:00:26"}
|
| 104 |
+
{"current_steps": 520, "total_steps": 1652, "loss": 0.2124, "lr": 3.4684297397677064e-05, "epoch": 2.2038216560509554, "percentage": 31.48, "elapsed_time": "5:29:27", "remaining_time": "11:57:12"}
|
| 105 |
+
{"current_steps": 525, "total_steps": 1652, "loss": 0.2005, "lr": 3.453994796990823e-05, "epoch": 2.225053078556263, "percentage": 31.78, "elapsed_time": "5:32:36", "remaining_time": "11:54:00"}
|
| 106 |
+
{"current_steps": 530, "total_steps": 1652, "loss": 0.1842, "lr": 3.439397388791662e-05, "epoch": 2.246284501061571, "percentage": 32.08, "elapsed_time": "5:35:46", "remaining_time": "11:50:48"}
|
| 107 |
+
{"current_steps": 535, "total_steps": 1652, "loss": 0.2108, "lr": 3.424639146244898e-05, "epoch": 2.267515923566879, "percentage": 32.38, "elapsed_time": "5:38:55", "remaining_time": "11:47:37"}
|
| 108 |
+
{"current_steps": 540, "total_steps": 1652, "loss": 0.2073, "lr": 3.409721718396395e-05, "epoch": 2.2887473460721868, "percentage": 32.69, "elapsed_time": "5:42:04", "remaining_time": "11:44:25"}
|
| 109 |
+
{"current_steps": 545, "total_steps": 1652, "loss": 0.2136, "lr": 3.394646772078951e-05, "epoch": 2.3099787685774946, "percentage": 32.99, "elapsed_time": "5:45:13", "remaining_time": "11:41:13"}
|
| 110 |
+
{"current_steps": 550, "total_steps": 1652, "loss": 0.1953, "lr": 3.379415991726047e-05, "epoch": 2.3312101910828025, "percentage": 33.29, "elapsed_time": "5:48:22", "remaining_time": "11:38:00"}
|
| 111 |
+
{"current_steps": 555, "total_steps": 1652, "loss": 0.2001, "lr": 3.3640310791836375e-05, "epoch": 2.3524416135881103, "percentage": 33.6, "elapsed_time": "5:51:31", "remaining_time": "11:34:49"}
|
| 112 |
+
{"current_steps": 560, "total_steps": 1652, "loss": 0.2171, "lr": 3.348493753519987e-05, "epoch": 2.373673036093418, "percentage": 33.9, "elapsed_time": "5:54:38", "remaining_time": "11:31:34"}
|
| 113 |
+
{"current_steps": 565, "total_steps": 1652, "loss": 0.1966, "lr": 3.332805750833588e-05, "epoch": 2.394904458598726, "percentage": 34.2, "elapsed_time": "5:57:47", "remaining_time": "11:28:20"}
|
| 114 |
+
{"current_steps": 570, "total_steps": 1652, "loss": 0.187, "lr": 3.3169688240591735e-05, "epoch": 2.416135881104034, "percentage": 34.5, "elapsed_time": "6:00:56", "remaining_time": "11:25:08"}
|
| 115 |
+
{"current_steps": 575, "total_steps": 1652, "loss": 0.1826, "lr": 3.300984742771849e-05, "epoch": 2.4373673036093417, "percentage": 34.81, "elapsed_time": "6:04:05", "remaining_time": "11:21:57"}
|
| 116 |
+
{"current_steps": 580, "total_steps": 1652, "loss": 0.2016, "lr": 3.284855292989363e-05, "epoch": 2.4585987261146496, "percentage": 35.11, "elapsed_time": "6:07:14", "remaining_time": "11:18:46"}
|
| 117 |
+
{"current_steps": 585, "total_steps": 1652, "loss": 0.1907, "lr": 3.268582276972549e-05, "epoch": 2.4798301486199574, "percentage": 35.41, "elapsed_time": "6:10:23", "remaining_time": "11:15:34"}
|
| 118 |
+
{"current_steps": 590, "total_steps": 1652, "loss": 0.2006, "lr": 3.252167513023934e-05, "epoch": 2.5010615711252653, "percentage": 35.71, "elapsed_time": "6:13:33", "remaining_time": "11:12:24"}
|
| 119 |
+
{"current_steps": 595, "total_steps": 1652, "loss": 0.1982, "lr": 3.2356128352845794e-05, "epoch": 2.522292993630573, "percentage": 36.02, "elapsed_time": "6:16:43", "remaining_time": "11:09:14"}
|
| 120 |
+
{"current_steps": 600, "total_steps": 1652, "loss": 0.1869, "lr": 3.218920093529129e-05, "epoch": 2.543524416135881, "percentage": 36.32, "elapsed_time": "6:19:53", "remaining_time": "11:06:04"}
|
| 121 |
+
{"current_steps": 605, "total_steps": 1652, "loss": 0.1757, "lr": 3.202091152959126e-05, "epoch": 2.564755838641189, "percentage": 36.62, "elapsed_time": "6:24:12", "remaining_time": "11:04:53"}
|
| 122 |
+
{"current_steps": 610, "total_steps": 1652, "loss": 0.1775, "lr": 3.1851278939945974e-05, "epoch": 2.5859872611464967, "percentage": 36.92, "elapsed_time": "6:27:20", "remaining_time": "11:01:40"}
|
| 123 |
+
{"current_steps": 615, "total_steps": 1652, "loss": 0.2035, "lr": 3.1680322120639436e-05, "epoch": 2.6072186836518045, "percentage": 37.23, "elapsed_time": "6:30:29", "remaining_time": "10:58:26"}
|
| 124 |
+
{"current_steps": 620, "total_steps": 1652, "loss": 0.1815, "lr": 3.150806017392145e-05, "epoch": 2.6284501061571124, "percentage": 37.53, "elapsed_time": "6:33:39", "remaining_time": "10:55:15"}
|
| 125 |
+
{"current_steps": 625, "total_steps": 1652, "loss": 0.1946, "lr": 3.1334512347873215e-05, "epoch": 2.6496815286624202, "percentage": 37.83, "elapsed_time": "6:36:49", "remaining_time": "10:52:04"}
|
| 126 |
+
{"current_steps": 630, "total_steps": 1652, "loss": 0.1946, "lr": 3.1159698034256595e-05, "epoch": 2.670912951167728, "percentage": 38.14, "elapsed_time": "6:39:58", "remaining_time": "10:48:50"}
|
| 127 |
+
{"current_steps": 635, "total_steps": 1652, "loss": 0.2026, "lr": 3.098363676634732e-05, "epoch": 2.692144373673036, "percentage": 38.44, "elapsed_time": "6:43:06", "remaining_time": "10:45:35"}
|
| 128 |
+
{"current_steps": 640, "total_steps": 1652, "loss": 0.1906, "lr": 3.080634821675239e-05, "epoch": 2.713375796178344, "percentage": 38.74, "elapsed_time": "6:46:14", "remaining_time": "10:42:22"}
|
| 129 |
+
{"current_steps": 645, "total_steps": 1652, "loss": 0.1943, "lr": 3.0627852195211944e-05, "epoch": 2.7346072186836516, "percentage": 39.04, "elapsed_time": "6:49:24", "remaining_time": "10:39:10"}
|
| 130 |
+
{"current_steps": 650, "total_steps": 1652, "loss": 0.1871, "lr": 3.0448168646385733e-05, "epoch": 2.7558386411889595, "percentage": 39.35, "elapsed_time": "6:52:32", "remaining_time": "10:35:57"}
|
| 131 |
+
{"current_steps": 655, "total_steps": 1652, "loss": 0.2121, "lr": 3.0267317647624584e-05, "epoch": 2.777070063694268, "percentage": 39.65, "elapsed_time": "6:55:40", "remaining_time": "10:32:43"}
|
| 132 |
+
{"current_steps": 660, "total_steps": 1652, "loss": 0.2165, "lr": 3.0085319406727003e-05, "epoch": 2.798301486199575, "percentage": 39.95, "elapsed_time": "6:58:48", "remaining_time": "10:29:29"}
|
| 133 |
+
{"current_steps": 665, "total_steps": 1652, "loss": 0.1886, "lr": 2.9902194259681203e-05, "epoch": 2.8195329087048835, "percentage": 40.25, "elapsed_time": "7:01:57", "remaining_time": "10:26:17"}
|
| 134 |
+
{"current_steps": 670, "total_steps": 1652, "loss": 0.1662, "lr": 2.9717962668392837e-05, "epoch": 2.840764331210191, "percentage": 40.56, "elapsed_time": "7:05:06", "remaining_time": "10:23:04"}
|
| 135 |
+
{"current_steps": 675, "total_steps": 1652, "loss": 0.186, "lr": 2.9532645218398608e-05, "epoch": 2.861995753715499, "percentage": 40.86, "elapsed_time": "7:08:15", "remaining_time": "10:19:52"}
|
| 136 |
+
{"current_steps": 680, "total_steps": 1652, "loss": 0.1798, "lr": 2.9346262616566128e-05, "epoch": 2.8832271762208066, "percentage": 41.16, "elapsed_time": "7:11:24", "remaining_time": "10:16:39"}
|
| 137 |
+
{"current_steps": 685, "total_steps": 1652, "loss": 0.1856, "lr": 2.9158835688780188e-05, "epoch": 2.904458598726115, "percentage": 41.46, "elapsed_time": "7:14:32", "remaining_time": "10:13:26"}
|
| 138 |
+
{"current_steps": 690, "total_steps": 1652, "loss": 0.1673, "lr": 2.89703853776157e-05, "epoch": 2.9256900212314223, "percentage": 41.77, "elapsed_time": "7:17:40", "remaining_time": "10:10:12"}
|
| 139 |
+
{"current_steps": 695, "total_steps": 1652, "loss": 0.1836, "lr": 2.878093273999765e-05, "epoch": 2.9469214437367306, "percentage": 42.07, "elapsed_time": "7:20:49", "remaining_time": "10:07:00"}
|
| 140 |
+
{"current_steps": 700, "total_steps": 1652, "loss": 0.1885, "lr": 2.859049894484828e-05, "epoch": 2.968152866242038, "percentage": 42.37, "elapsed_time": "7:23:57", "remaining_time": "10:03:46"}
|
| 141 |
+
{"current_steps": 705, "total_steps": 1652, "loss": 0.2006, "lr": 2.8399105270721668e-05, "epoch": 2.9893842887473463, "percentage": 42.68, "elapsed_time": "7:27:06", "remaining_time": "10:00:34"}
|
| 142 |
+
{"current_steps": 710, "total_steps": 1652, "loss": 0.168, "lr": 2.8206773103426187e-05, "epoch": 3.008492569002123, "percentage": 42.98, "elapsed_time": "7:29:57", "remaining_time": "9:56:58"}
|
| 143 |
+
{"current_steps": 715, "total_steps": 1652, "loss": 0.2028, "lr": 2.8013523933634875e-05, "epoch": 3.029723991507431, "percentage": 43.28, "elapsed_time": "7:33:05", "remaining_time": "9:53:46"}
|
| 144 |
+
{"current_steps": 720, "total_steps": 1652, "loss": 0.1776, "lr": 2.7819379354484124e-05, "epoch": 3.050955414012739, "percentage": 43.58, "elapsed_time": "7:36:14", "remaining_time": "9:50:34"}
|
| 145 |
+
{"current_steps": 725, "total_steps": 1652, "loss": 0.1852, "lr": 2.762436105916094e-05, "epoch": 3.0721868365180467, "percentage": 43.89, "elapsed_time": "7:39:23", "remaining_time": "9:47:22"}
|
| 146 |
+
{"current_steps": 730, "total_steps": 1652, "loss": 0.2002, "lr": 2.742849083847899e-05, "epoch": 3.0934182590233545, "percentage": 44.19, "elapsed_time": "7:42:31", "remaining_time": "9:44:10"}
|
| 147 |
+
{"current_steps": 735, "total_steps": 1652, "loss": 0.1666, "lr": 2.7231790578443785e-05, "epoch": 3.1146496815286624, "percentage": 44.49, "elapsed_time": "7:45:41", "remaining_time": "9:41:00"}
|
| 148 |
+
{"current_steps": 740, "total_steps": 1652, "loss": 0.1877, "lr": 2.7034282257807136e-05, "epoch": 3.1358811040339702, "percentage": 44.79, "elapsed_time": "7:48:50", "remaining_time": "9:37:49"}
|
| 149 |
+
{"current_steps": 745, "total_steps": 1652, "loss": 0.1819, "lr": 2.683598794561138e-05, "epoch": 3.157112526539278, "percentage": 45.1, "elapsed_time": "7:51:59", "remaining_time": "9:34:37"}
|
| 150 |
+
{"current_steps": 750, "total_steps": 1652, "loss": 0.1758, "lr": 2.66369297987234e-05, "epoch": 3.178343949044586, "percentage": 45.4, "elapsed_time": "7:55:09", "remaining_time": "9:31:27"}
|
| 151 |
+
{"current_steps": 755, "total_steps": 1652, "loss": 0.1607, "lr": 2.643713005935888e-05, "epoch": 3.199575371549894, "percentage": 45.7, "elapsed_time": "7:58:18", "remaining_time": "9:28:16"}
|
| 152 |
+
{"current_steps": 760, "total_steps": 1652, "loss": 0.1732, "lr": 2.6236611052597055e-05, "epoch": 3.2208067940552016, "percentage": 46.0, "elapsed_time": "8:01:28", "remaining_time": "9:25:05"}
|
| 153 |
+
{"current_steps": 765, "total_steps": 1652, "loss": 0.1782, "lr": 2.603539518388611e-05, "epoch": 3.2420382165605095, "percentage": 46.31, "elapsed_time": "8:04:37", "remaining_time": "9:21:54"}
|
| 154 |
+
{"current_steps": 770, "total_steps": 1652, "loss": 0.1794, "lr": 2.5833504936539712e-05, "epoch": 3.2632696390658174, "percentage": 46.61, "elapsed_time": "8:07:45", "remaining_time": "9:18:42"}
|
| 155 |
+
{"current_steps": 775, "total_steps": 1652, "loss": 0.1948, "lr": 2.563096286922474e-05, "epoch": 3.284501061571125, "percentage": 46.91, "elapsed_time": "8:10:53", "remaining_time": "9:15:30"}
|
| 156 |
+
{"current_steps": 780, "total_steps": 1652, "loss": 0.1825, "lr": 2.54277916134407e-05, "epoch": 3.305732484076433, "percentage": 47.22, "elapsed_time": "8:14:04", "remaining_time": "9:12:20"}
|
| 157 |
+
{"current_steps": 785, "total_steps": 1652, "loss": 0.1861, "lr": 2.5224013870990868e-05, "epoch": 3.326963906581741, "percentage": 47.52, "elapsed_time": "8:17:14", "remaining_time": "9:09:10"}
|
| 158 |
+
{"current_steps": 790, "total_steps": 1652, "loss": 0.1929, "lr": 2.5019652411445704e-05, "epoch": 3.3481953290870488, "percentage": 47.82, "elapsed_time": "8:20:23", "remaining_time": "9:06:00"}
|
| 159 |
+
{"current_steps": 795, "total_steps": 1652, "loss": 0.1737, "lr": 2.4814730069598624e-05, "epoch": 3.3694267515923566, "percentage": 48.12, "elapsed_time": "8:23:33", "remaining_time": "9:02:49"}
|
| 160 |
+
{"current_steps": 800, "total_steps": 1652, "loss": 0.1916, "lr": 2.460926974291451e-05, "epoch": 3.3906581740976645, "percentage": 48.43, "elapsed_time": "8:26:41", "remaining_time": "8:59:37"}
|
| 161 |
+
{"current_steps": 805, "total_steps": 1652, "loss": 0.1705, "lr": 2.440329438897122e-05, "epoch": 3.4118895966029723, "percentage": 48.73, "elapsed_time": "8:31:00", "remaining_time": "8:57:40"}
|
| 162 |
+
{"current_steps": 810, "total_steps": 1652, "loss": 0.1584, "lr": 2.419682702289432e-05, "epoch": 3.43312101910828, "percentage": 49.03, "elapsed_time": "8:34:10", "remaining_time": "8:54:29"}
|
| 163 |
+
{"current_steps": 815, "total_steps": 1652, "loss": 0.1632, "lr": 2.3989890714785505e-05, "epoch": 3.454352441613588, "percentage": 49.33, "elapsed_time": "8:37:19", "remaining_time": "8:51:17"}
|
| 164 |
+
{"current_steps": 820, "total_steps": 1652, "loss": 0.1693, "lr": 2.3782508587144774e-05, "epoch": 3.475583864118896, "percentage": 49.64, "elapsed_time": "8:40:28", "remaining_time": "8:48:05"}
|
| 165 |
+
{"current_steps": 825, "total_steps": 1652, "loss": 0.1746, "lr": 2.3574703812286766e-05, "epoch": 3.4968152866242037, "percentage": 49.94, "elapsed_time": "8:43:37", "remaining_time": "8:44:53"}
|
| 166 |
+
{"current_steps": 830, "total_steps": 1652, "loss": 0.1736, "lr": 2.3366499609751593e-05, "epoch": 3.5180467091295116, "percentage": 50.24, "elapsed_time": "8:46:45", "remaining_time": "8:41:41"}
|
| 167 |
+
{"current_steps": 835, "total_steps": 1652, "loss": 0.1789, "lr": 2.3157919243710318e-05, "epoch": 3.5392781316348194, "percentage": 50.54, "elapsed_time": "8:49:54", "remaining_time": "8:38:29"}
|
| 168 |
+
{"current_steps": 840, "total_steps": 1652, "loss": 0.1955, "lr": 2.2948986020365493e-05, "epoch": 3.5605095541401273, "percentage": 50.85, "elapsed_time": "8:53:03", "remaining_time": "8:35:17"}
|
| 169 |
+
{"current_steps": 845, "total_steps": 1652, "loss": 0.2052, "lr": 2.273972328534698e-05, "epoch": 3.581740976645435, "percentage": 51.15, "elapsed_time": "8:56:12", "remaining_time": "8:32:05"}
|
| 170 |
+
{"current_steps": 850, "total_steps": 1652, "loss": 0.1627, "lr": 2.2530154421103386e-05, "epoch": 3.602972399150743, "percentage": 51.45, "elapsed_time": "8:59:21", "remaining_time": "8:28:54"}
|
| 171 |
+
{"current_steps": 855, "total_steps": 1652, "loss": 0.2028, "lr": 2.2320302844289366e-05, "epoch": 3.624203821656051, "percentage": 51.76, "elapsed_time": "9:02:29", "remaining_time": "8:25:41"}
|
| 172 |
+
{"current_steps": 860, "total_steps": 1652, "loss": 0.1643, "lr": 2.21101920031491e-05, "epoch": 3.6454352441613587, "percentage": 52.06, "elapsed_time": "9:05:38", "remaining_time": "8:22:29"}
|
| 173 |
+
{"current_steps": 865, "total_steps": 1652, "loss": 0.1724, "lr": 2.1899845374896264e-05, "epoch": 3.6666666666666665, "percentage": 52.36, "elapsed_time": "9:08:46", "remaining_time": "8:19:17"}
|
| 174 |
+
{"current_steps": 870, "total_steps": 1652, "loss": 0.1844, "lr": 2.168928646309074e-05, "epoch": 3.6878980891719744, "percentage": 52.66, "elapsed_time": "9:11:55", "remaining_time": "8:16:05"}
|
| 175 |
+
{"current_steps": 875, "total_steps": 1652, "loss": 0.2002, "lr": 2.14785387950124e-05, "epoch": 3.709129511677282, "percentage": 52.97, "elapsed_time": "9:15:03", "remaining_time": "8:12:53"}
|
| 176 |
+
{"current_steps": 880, "total_steps": 1652, "loss": 0.1949, "lr": 2.1267625919032233e-05, "epoch": 3.73036093418259, "percentage": 53.27, "elapsed_time": "9:18:11", "remaining_time": "8:09:40"}
|
| 177 |
+
{"current_steps": 885, "total_steps": 1652, "loss": 0.1882, "lr": 2.10565714019811e-05, "epoch": 3.7515923566878984, "percentage": 53.57, "elapsed_time": "9:21:19", "remaining_time": "8:06:29"}
|
| 178 |
+
{"current_steps": 890, "total_steps": 1652, "loss": 0.1844, "lr": 2.0845398826516457e-05, "epoch": 3.7728237791932058, "percentage": 53.87, "elapsed_time": "9:24:29", "remaining_time": "8:03:18"}
|
| 179 |
+
{"current_steps": 895, "total_steps": 1652, "loss": 0.187, "lr": 2.0634131788487278e-05, "epoch": 3.794055201698514, "percentage": 54.18, "elapsed_time": "9:27:37", "remaining_time": "8:00:06"}
|
| 180 |
+
{"current_steps": 900, "total_steps": 1652, "loss": 0.1743, "lr": 2.0422793894297533e-05, "epoch": 3.8152866242038215, "percentage": 54.48, "elapsed_time": "9:30:46", "remaining_time": "7:56:54"}
|
| 181 |
+
{"current_steps": 905, "total_steps": 1652, "loss": 0.1832, "lr": 2.0211408758268468e-05, "epoch": 3.8365180467091298, "percentage": 54.78, "elapsed_time": "9:33:54", "remaining_time": "7:53:42"}
|
| 182 |
+
{"current_steps": 910, "total_steps": 1652, "loss": 0.1815, "lr": 2e-05, "epoch": 3.857749469214437, "percentage": 55.08, "elapsed_time": "9:37:02", "remaining_time": "7:50:30"}
|
| 183 |
+
{"current_steps": 915, "total_steps": 1652, "loss": 0.1781, "lr": 1.9788591241731535e-05, "epoch": 3.8789808917197455, "percentage": 55.39, "elapsed_time": "9:40:10", "remaining_time": "7:47:18"}
|
| 184 |
+
{"current_steps": 920, "total_steps": 1652, "loss": 0.1811, "lr": 1.9577206105702474e-05, "epoch": 3.900212314225053, "percentage": 55.69, "elapsed_time": "9:43:19", "remaining_time": "7:44:07"}
|
| 185 |
+
{"current_steps": 925, "total_steps": 1652, "loss": 0.1958, "lr": 1.9365868211512725e-05, "epoch": 3.921443736730361, "percentage": 55.99, "elapsed_time": "9:46:27", "remaining_time": "7:40:55"}
|
| 186 |
+
{"current_steps": 930, "total_steps": 1652, "loss": 0.1675, "lr": 1.915460117348355e-05, "epoch": 3.9426751592356686, "percentage": 56.3, "elapsed_time": "9:49:36", "remaining_time": "7:37:44"}
|
| 187 |
+
{"current_steps": 935, "total_steps": 1652, "loss": 0.1729, "lr": 1.8943428598018904e-05, "epoch": 3.963906581740977, "percentage": 56.6, "elapsed_time": "9:52:45", "remaining_time": "7:34:33"}
|
| 188 |
+
{"current_steps": 940, "total_steps": 1652, "loss": 0.1664, "lr": 1.8732374080967774e-05, "epoch": 3.9851380042462843, "percentage": 56.9, "elapsed_time": "9:55:54", "remaining_time": "7:31:21"}
|
| 189 |
+
{"current_steps": 945, "total_steps": 1652, "loss": 0.1569, "lr": 1.8521461204987606e-05, "epoch": 4.004246284501062, "percentage": 57.2, "elapsed_time": "9:58:44", "remaining_time": "7:27:57"}
|
| 190 |
+
{"current_steps": 950, "total_steps": 1652, "loss": 0.1668, "lr": 1.8310713536909265e-05, "epoch": 4.025477707006369, "percentage": 57.51, "elapsed_time": "10:01:52", "remaining_time": "7:24:45"}
|
| 191 |
+
{"current_steps": 955, "total_steps": 1652, "loss": 0.1697, "lr": 1.810015462510374e-05, "epoch": 4.046709129511678, "percentage": 57.81, "elapsed_time": "10:05:01", "remaining_time": "7:21:34"}
|
| 192 |
+
{"current_steps": 960, "total_steps": 1652, "loss": 0.2036, "lr": 1.7889807996850906e-05, "epoch": 4.067940552016985, "percentage": 58.11, "elapsed_time": "10:08:10", "remaining_time": "7:18:23"}
|
| 193 |
+
{"current_steps": 965, "total_steps": 1652, "loss": 0.1686, "lr": 1.767969715571064e-05, "epoch": 4.089171974522293, "percentage": 58.41, "elapsed_time": "10:11:18", "remaining_time": "7:15:12"}
|
| 194 |
+
{"current_steps": 970, "total_steps": 1652, "loss": 0.1837, "lr": 1.746984557889662e-05, "epoch": 4.110403397027601, "percentage": 58.72, "elapsed_time": "10:14:26", "remaining_time": "7:12:00"}
|
| 195 |
+
{"current_steps": 975, "total_steps": 1652, "loss": 0.1704, "lr": 1.7260276714653023e-05, "epoch": 4.131634819532909, "percentage": 59.02, "elapsed_time": "10:17:35", "remaining_time": "7:08:49"}
|
| 196 |
+
{"current_steps": 980, "total_steps": 1652, "loss": 0.1607, "lr": 1.7051013979634514e-05, "epoch": 4.1528662420382165, "percentage": 59.32, "elapsed_time": "10:20:44", "remaining_time": "7:05:39"}
|
| 197 |
+
{"current_steps": 985, "total_steps": 1652, "loss": 0.1826, "lr": 1.684208075628969e-05, "epoch": 4.174097664543525, "percentage": 59.62, "elapsed_time": "10:23:52", "remaining_time": "7:02:27"}
|
| 198 |
+
{"current_steps": 990, "total_steps": 1652, "loss": 0.1573, "lr": 1.6633500390248414e-05, "epoch": 4.195329087048832, "percentage": 59.93, "elapsed_time": "10:27:00", "remaining_time": "6:59:16"}
|
| 199 |
+
{"current_steps": 995, "total_steps": 1652, "loss": 0.1518, "lr": 1.642529618771324e-05, "epoch": 4.2165605095541405, "percentage": 60.23, "elapsed_time": "10:30:09", "remaining_time": "6:56:05"}
|
| 200 |
+
{"current_steps": 1000, "total_steps": 1652, "loss": 0.1813, "lr": 1.6217491412855233e-05, "epoch": 4.237791932059448, "percentage": 60.53, "elapsed_time": "10:33:17", "remaining_time": "6:52:54"}
|
| 201 |
+
{"current_steps": 1005, "total_steps": 1652, "loss": 0.1753, "lr": 1.60101092852145e-05, "epoch": 4.259023354564756, "percentage": 60.84, "elapsed_time": "10:37:33", "remaining_time": "6:50:27"}
|
| 202 |
+
{"current_steps": 1010, "total_steps": 1652, "loss": 0.1579, "lr": 1.5803172977105686e-05, "epoch": 4.280254777070064, "percentage": 61.14, "elapsed_time": "10:40:43", "remaining_time": "6:47:16"}
|
| 203 |
+
{"current_steps": 1015, "total_steps": 1652, "loss": 0.1607, "lr": 1.5596705611028792e-05, "epoch": 4.301486199575372, "percentage": 61.44, "elapsed_time": "10:43:54", "remaining_time": "6:44:06"}
|
| 204 |
+
{"current_steps": 1020, "total_steps": 1652, "loss": 0.162, "lr": 1.5390730257085494e-05, "epoch": 4.322717622080679, "percentage": 61.74, "elapsed_time": "10:47:03", "remaining_time": "6:40:55"}
|
| 205 |
+
{"current_steps": 1025, "total_steps": 1652, "loss": 0.1734, "lr": 1.5185269930401381e-05, "epoch": 4.343949044585988, "percentage": 62.05, "elapsed_time": "10:50:13", "remaining_time": "6:37:44"}
|
| 206 |
+
{"current_steps": 1030, "total_steps": 1652, "loss": 0.1632, "lr": 1.4980347588554302e-05, "epoch": 4.365180467091295, "percentage": 62.35, "elapsed_time": "10:53:23", "remaining_time": "6:34:34"}
|
| 207 |
+
{"current_steps": 1035, "total_steps": 1652, "loss": 0.1897, "lr": 1.4775986129009137e-05, "epoch": 4.386411889596603, "percentage": 62.65, "elapsed_time": "10:56:32", "remaining_time": "6:31:23"}
|
| 208 |
+
{"current_steps": 1040, "total_steps": 1652, "loss": 0.1672, "lr": 1.4572208386559304e-05, "epoch": 4.407643312101911, "percentage": 62.95, "elapsed_time": "10:59:42", "remaining_time": "6:28:12"}
|
| 209 |
+
{"current_steps": 1045, "total_steps": 1652, "loss": 0.1598, "lr": 1.436903713077526e-05, "epoch": 4.428874734607219, "percentage": 63.26, "elapsed_time": "11:02:52", "remaining_time": "6:25:02"}
|
| 210 |
+
{"current_steps": 1050, "total_steps": 1652, "loss": 0.1619, "lr": 1.4166495063460295e-05, "epoch": 4.450106157112526, "percentage": 63.56, "elapsed_time": "11:06:01", "remaining_time": "6:21:51"}
|
| 211 |
+
{"current_steps": 1055, "total_steps": 1652, "loss": 0.1806, "lr": 1.3964604816113896e-05, "epoch": 4.471337579617835, "percentage": 63.86, "elapsed_time": "11:09:09", "remaining_time": "6:18:39"}
|
| 212 |
+
{"current_steps": 1060, "total_steps": 1652, "loss": 0.1715, "lr": 1.3763388947402953e-05, "epoch": 4.492569002123142, "percentage": 64.16, "elapsed_time": "11:12:17", "remaining_time": "6:15:28"}
|
| 213 |
+
{"current_steps": 1065, "total_steps": 1652, "loss": 0.1691, "lr": 1.3562869940641123e-05, "epoch": 4.51380042462845, "percentage": 64.47, "elapsed_time": "11:15:26", "remaining_time": "6:12:17"}
|
| 214 |
+
{"current_steps": 1070, "total_steps": 1652, "loss": 0.1601, "lr": 1.3363070201276606e-05, "epoch": 4.535031847133758, "percentage": 64.77, "elapsed_time": "11:18:35", "remaining_time": "6:09:06"}
|
| 215 |
+
{"current_steps": 1075, "total_steps": 1652, "loss": 0.1715, "lr": 1.316401205438862e-05, "epoch": 4.556263269639066, "percentage": 65.07, "elapsed_time": "11:21:44", "remaining_time": "6:05:55"}
|
| 216 |
+
{"current_steps": 1080, "total_steps": 1652, "loss": 0.1734, "lr": 1.2965717742192866e-05, "epoch": 4.5774946921443735, "percentage": 65.38, "elapsed_time": "11:24:53", "remaining_time": "6:02:44"}
|
| 217 |
+
{"current_steps": 1085, "total_steps": 1652, "loss": 0.1702, "lr": 1.276820942155622e-05, "epoch": 4.598726114649682, "percentage": 65.68, "elapsed_time": "11:28:02", "remaining_time": "5:59:33"}
|
| 218 |
+
{"current_steps": 1090, "total_steps": 1652, "loss": 0.1714, "lr": 1.2571509161521007e-05, "epoch": 4.619957537154989, "percentage": 65.98, "elapsed_time": "11:31:12", "remaining_time": "5:56:22"}
|
| 219 |
+
{"current_steps": 1095, "total_steps": 1652, "loss": 0.1954, "lr": 1.2375638940839062e-05, "epoch": 4.6411889596602975, "percentage": 66.28, "elapsed_time": "11:34:20", "remaining_time": "5:53:11"}
{"current_steps": 1100, "total_steps": 1652, "loss": 0.1619, "lr": 1.2180620645515875e-05, "epoch": 4.662420382165605, "percentage": 66.59, "elapsed_time": "11:37:30", "remaining_time": "5:50:01"}
{"current_steps": 1105, "total_steps": 1652, "loss": 0.1794, "lr": 1.1986476066365125e-05, "epoch": 4.683651804670913, "percentage": 66.89, "elapsed_time": "11:40:39", "remaining_time": "5:46:50"}
{"current_steps": 1110, "total_steps": 1652, "loss": 0.1964, "lr": 1.179322689657381e-05, "epoch": 4.704883227176221, "percentage": 67.19, "elapsed_time": "11:43:47", "remaining_time": "5:43:39"}
{"current_steps": 1115, "total_steps": 1652, "loss": 0.1633, "lr": 1.1600894729278333e-05, "epoch": 4.726114649681529, "percentage": 67.49, "elapsed_time": "11:46:55", "remaining_time": "5:40:28"}
{"current_steps": 1120, "total_steps": 1652, "loss": 0.1663, "lr": 1.1409501055151726e-05, "epoch": 4.747346072186836, "percentage": 67.8, "elapsed_time": "11:50:04", "remaining_time": "5:37:17"}
{"current_steps": 1125, "total_steps": 1652, "loss": 0.1481, "lr": 1.1219067260002352e-05, "epoch": 4.768577494692145, "percentage": 68.1, "elapsed_time": "11:53:13", "remaining_time": "5:34:06"}
{"current_steps": 1130, "total_steps": 1652, "loss": 0.1763, "lr": 1.1029614622384307e-05, "epoch": 4.789808917197452, "percentage": 68.4, "elapsed_time": "11:56:22", "remaining_time": "5:30:55"}
{"current_steps": 1135, "total_steps": 1652, "loss": 0.1665, "lr": 1.0841164311219812e-05, "epoch": 4.81104033970276, "percentage": 68.7, "elapsed_time": "11:59:30", "remaining_time": "5:27:44"}
{"current_steps": 1140, "total_steps": 1652, "loss": 0.1727, "lr": 1.0653737383433869e-05, "epoch": 4.832271762208068, "percentage": 69.01, "elapsed_time": "12:02:39", "remaining_time": "5:24:33"}
{"current_steps": 1145, "total_steps": 1652, "loss": 0.1664, "lr": 1.0467354781601395e-05, "epoch": 4.853503184713376, "percentage": 69.31, "elapsed_time": "12:05:47", "remaining_time": "5:21:22"}
{"current_steps": 1150, "total_steps": 1652, "loss": 0.165, "lr": 1.0282037331607167e-05, "epoch": 4.8747346072186835, "percentage": 69.61, "elapsed_time": "12:08:56", "remaining_time": "5:18:11"}
{"current_steps": 1155, "total_steps": 1652, "loss": 0.1613, "lr": 1.0097805740318797e-05, "epoch": 4.895966029723992, "percentage": 69.92, "elapsed_time": "12:12:05", "remaining_time": "5:15:01"}
{"current_steps": 1160, "total_steps": 1652, "loss": 0.1855, "lr": 9.914680593273e-06, "epoch": 4.917197452229299, "percentage": 70.22, "elapsed_time": "12:15:13", "remaining_time": "5:11:50"}
{"current_steps": 1165, "total_steps": 1652, "loss": 0.1692, "lr": 9.732682352375418e-06, "epoch": 4.9384288747346075, "percentage": 70.52, "elapsed_time": "12:18:22", "remaining_time": "5:08:39"}
{"current_steps": 1170, "total_steps": 1652, "loss": 0.1666, "lr": 9.551831353614272e-06, "epoch": 4.959660297239915, "percentage": 70.82, "elapsed_time": "12:21:31", "remaining_time": "5:05:29"}
{"current_steps": 1175, "total_steps": 1652, "loss": 0.1838, "lr": 9.372147804788063e-06, "epoch": 4.980891719745223, "percentage": 71.13, "elapsed_time": "12:24:40", "remaining_time": "5:02:18"}
{"current_steps": 1180, "total_steps": 1652, "loss": 0.1652, "lr": 9.193651783247616e-06, "epoch": 5.0, "percentage": 71.43, "elapsed_time": "12:27:29", "remaining_time": "4:58:59"}
{"current_steps": 1185, "total_steps": 1652, "loss": 0.1657, "lr": 9.016363233652686e-06, "epoch": 5.021231422505308, "percentage": 71.73, "elapsed_time": "12:30:38", "remaining_time": "4:55:49"}
{"current_steps": 1190, "total_steps": 1652, "loss": 0.1813, "lr": 8.840301965743405e-06, "epoch": 5.042462845010616, "percentage": 72.03, "elapsed_time": "12:33:47", "remaining_time": "4:52:38"}
{"current_steps": 1195, "total_steps": 1652, "loss": 0.1678, "lr": 8.665487652126785e-06, "epoch": 5.063694267515924, "percentage": 72.34, "elapsed_time": "12:36:56", "remaining_time": "4:49:28"}
{"current_steps": 1200, "total_steps": 1652, "loss": 0.1691, "lr": 8.491939826078552e-06, "epoch": 5.084925690021231, "percentage": 72.64, "elapsed_time": "12:40:05", "remaining_time": "4:46:18"}
{"current_steps": 1205, "total_steps": 1652, "loss": 0.1662, "lr": 8.319677879360566e-06, "epoch": 5.10615711252654, "percentage": 72.94, "elapsed_time": "12:44:22", "remaining_time": "4:43:32"}
{"current_steps": 1210, "total_steps": 1652, "loss": 0.1576, "lr": 8.148721060054026e-06, "epoch": 5.127388535031847, "percentage": 73.24, "elapsed_time": "12:47:31", "remaining_time": "4:40:22"}
{"current_steps": 1215, "total_steps": 1652, "loss": 0.1633, "lr": 7.979088470408743e-06, "epoch": 5.148619957537155, "percentage": 73.55, "elapsed_time": "12:50:39", "remaining_time": "4:37:11"}
{"current_steps": 1220, "total_steps": 1652, "loss": 0.1707, "lr": 7.81079906470872e-06, "epoch": 5.169851380042463, "percentage": 73.85, "elapsed_time": "12:53:48", "remaining_time": "4:34:00"}
{"current_steps": 1225, "total_steps": 1652, "loss": 0.1675, "lr": 7.643871647154212e-06, "epoch": 5.191082802547771, "percentage": 74.15, "elapsed_time": "12:56:57", "remaining_time": "4:30:49"}
{"current_steps": 1230, "total_steps": 1652, "loss": 0.1598, "lr": 7.478324869760665e-06, "epoch": 5.2123142250530785, "percentage": 74.46, "elapsed_time": "13:00:06", "remaining_time": "4:27:38"}
{"current_steps": 1235, "total_steps": 1652, "loss": 0.1498, "lr": 7.314177230274522e-06, "epoch": 5.233545647558387, "percentage": 74.76, "elapsed_time": "13:03:14", "remaining_time": "4:24:27"}
{"current_steps": 1240, "total_steps": 1652, "loss": 0.1557, "lr": 7.151447070106372e-06, "epoch": 5.254777070063694, "percentage": 75.06, "elapsed_time": "13:06:23", "remaining_time": "4:21:17"}
{"current_steps": 1245, "total_steps": 1652, "loss": 0.1682, "lr": 6.990152572281523e-06, "epoch": 5.2760084925690025, "percentage": 75.36, "elapsed_time": "13:09:32", "remaining_time": "4:18:06"}
{"current_steps": 1250, "total_steps": 1652, "loss": 0.1478, "lr": 6.830311759408275e-06, "epoch": 5.29723991507431, "percentage": 75.67, "elapsed_time": "13:12:40", "remaining_time": "4:14:55"}
{"current_steps": 1255, "total_steps": 1652, "loss": 0.1583, "lr": 6.671942491664128e-06, "epoch": 5.318471337579618, "percentage": 75.97, "elapsed_time": "13:15:50", "remaining_time": "4:11:45"}
{"current_steps": 1260, "total_steps": 1652, "loss": 0.1617, "lr": 6.515062464800139e-06, "epoch": 5.339702760084926, "percentage": 76.27, "elapsed_time": "13:18:59", "remaining_time": "4:08:34"}
{"current_steps": 1265, "total_steps": 1652, "loss": 0.1786, "lr": 6.359689208163635e-06, "epoch": 5.360934182590234, "percentage": 76.57, "elapsed_time": "13:22:06", "remaining_time": "4:05:23"}
{"current_steps": 1270, "total_steps": 1652, "loss": 0.1704, "lr": 6.205840082739538e-06, "epoch": 5.382165605095541, "percentage": 76.88, "elapsed_time": "13:25:15", "remaining_time": "4:02:12"}
{"current_steps": 1275, "total_steps": 1652, "loss": 0.1819, "lr": 6.053532279210494e-06, "epoch": 5.40339702760085, "percentage": 77.18, "elapsed_time": "13:28:23", "remaining_time": "3:59:01"}
{"current_steps": 1280, "total_steps": 1652, "loss": 0.1516, "lr": 5.90278281603605e-06, "epoch": 5.424628450106157, "percentage": 77.48, "elapsed_time": "13:31:32", "remaining_time": "3:55:51"}
{"current_steps": 1285, "total_steps": 1652, "loss": 0.1751, "lr": 5.753608537551023e-06, "epoch": 5.445859872611465, "percentage": 77.78, "elapsed_time": "13:34:41", "remaining_time": "3:52:40"}
{"current_steps": 1290, "total_steps": 1652, "loss": 0.172, "lr": 5.606026112083383e-06, "epoch": 5.467091295116773, "percentage": 78.09, "elapsed_time": "13:37:51", "remaining_time": "3:49:30"}
{"current_steps": 1295, "total_steps": 1652, "loss": 0.1669, "lr": 5.460052030091782e-06, "epoch": 5.488322717622081, "percentage": 78.39, "elapsed_time": "13:41:01", "remaining_time": "3:46:20"}
{"current_steps": 1300, "total_steps": 1652, "loss": 0.159, "lr": 5.315702602322943e-06, "epoch": 5.509554140127388, "percentage": 78.69, "elapsed_time": "13:44:10", "remaining_time": "3:43:09"}
{"current_steps": 1305, "total_steps": 1652, "loss": 0.162, "lr": 5.1729939579891476e-06, "epoch": 5.530785562632697, "percentage": 79.0, "elapsed_time": "13:47:20", "remaining_time": "3:39:59"}
{"current_steps": 1310, "total_steps": 1652, "loss": 0.1647, "lr": 5.031942042966e-06, "epoch": 5.552016985138004, "percentage": 79.3, "elapsed_time": "13:50:29", "remaining_time": "3:36:48"}
{"current_steps": 1315, "total_steps": 1652, "loss": 0.1361, "lr": 4.892562618010684e-06, "epoch": 5.573248407643312, "percentage": 79.6, "elapsed_time": "13:53:38", "remaining_time": "3:33:38"}
{"current_steps": 1320, "total_steps": 1652, "loss": 0.1583, "lr": 4.754871257000888e-06, "epoch": 5.59447983014862, "percentage": 79.9, "elapsed_time": "13:56:46", "remaining_time": "3:30:27"}
{"current_steps": 1325, "total_steps": 1652, "loss": 0.1579, "lr": 4.618883345194627e-06, "epoch": 5.615711252653928, "percentage": 80.21, "elapsed_time": "13:59:55", "remaining_time": "3:27:17"}
{"current_steps": 1330, "total_steps": 1652, "loss": 0.1633, "lr": 4.484614077511153e-06, "epoch": 5.6369426751592355, "percentage": 80.51, "elapsed_time": "14:03:03", "remaining_time": "3:24:06"}
{"current_steps": 1335, "total_steps": 1652, "loss": 0.1719, "lr": 4.352078456833082e-06, "epoch": 5.658174097664544, "percentage": 80.81, "elapsed_time": "14:06:12", "remaining_time": "3:20:56"}
{"current_steps": 1340, "total_steps": 1652, "loss": 0.1775, "lr": 4.221291292330036e-06, "epoch": 5.679405520169851, "percentage": 81.11, "elapsed_time": "14:09:21", "remaining_time": "3:17:45"}
{"current_steps": 1345, "total_steps": 1652, "loss": 0.1676, "lr": 4.0922671978039055e-06, "epoch": 5.7006369426751595, "percentage": 81.42, "elapsed_time": "14:12:30", "remaining_time": "3:14:35"}
{"current_steps": 1350, "total_steps": 1652, "loss": 0.1975, "lr": 3.965020590055934e-06, "epoch": 5.721868365180467, "percentage": 81.72, "elapsed_time": "14:15:39", "remaining_time": "3:11:24"}
{"current_steps": 1355, "total_steps": 1652, "loss": 0.1555, "lr": 3.839565687275835e-06, "epoch": 5.743099787685775, "percentage": 82.02, "elapsed_time": "14:18:48", "remaining_time": "3:08:14"}
{"current_steps": 1360, "total_steps": 1652, "loss": 0.1423, "lr": 3.715916507453079e-06, "epoch": 5.764331210191083, "percentage": 82.32, "elapsed_time": "14:21:57", "remaining_time": "3:05:04"}
{"current_steps": 1365, "total_steps": 1652, "loss": 0.1408, "lr": 3.5940868668105644e-06, "epoch": 5.785562632696391, "percentage": 82.63, "elapsed_time": "14:25:07", "remaining_time": "3:01:53"}
{"current_steps": 1370, "total_steps": 1652, "loss": 0.1697, "lr": 3.4740903782608416e-06, "epoch": 5.806794055201698, "percentage": 82.93, "elapsed_time": "14:28:16", "remaining_time": "2:58:43"}
{"current_steps": 1375, "total_steps": 1652, "loss": 0.1836, "lr": 3.3559404498850245e-06, "epoch": 5.828025477707007, "percentage": 83.23, "elapsed_time": "14:31:24", "remaining_time": "2:55:32"}
{"current_steps": 1380, "total_steps": 1652, "loss": 0.158, "lr": 3.2396502834346277e-06, "epoch": 5.849256900212314, "percentage": 83.54, "elapsed_time": "14:34:32", "remaining_time": "2:52:22"}
{"current_steps": 1385, "total_steps": 1652, "loss": 0.1609, "lr": 3.1252328728564206e-06, "epoch": 5.870488322717622, "percentage": 83.84, "elapsed_time": "14:37:41", "remaining_time": "2:49:12"}
{"current_steps": 1390, "total_steps": 1652, "loss": 0.1604, "lr": 3.0127010028405303e-06, "epoch": 5.89171974522293, "percentage": 84.14, "elapsed_time": "14:40:49", "remaining_time": "2:46:01"}
{"current_steps": 1395, "total_steps": 1652, "loss": 0.1683, "lr": 2.9020672473919107e-06, "epoch": 5.912951167728238, "percentage": 84.44, "elapsed_time": "14:43:58", "remaining_time": "2:42:51"}
{"current_steps": 1400, "total_steps": 1652, "loss": 0.1706, "lr": 2.7933439684253616e-06, "epoch": 5.934182590233545, "percentage": 84.75, "elapsed_time": "14:47:07", "remaining_time": "2:39:41"}
{"current_steps": 1405, "total_steps": 1652, "loss": 0.1693, "lr": 2.6865433143842356e-06, "epoch": 5.955414012738854, "percentage": 85.05, "elapsed_time": "14:51:27", "remaining_time": "2:36:43"}
{"current_steps": 1410, "total_steps": 1652, "loss": 0.165, "lr": 2.5816772188830098e-06, "epoch": 5.976645435244161, "percentage": 85.35, "elapsed_time": "14:54:36", "remaining_time": "2:33:32"}
{"current_steps": 1415, "total_steps": 1652, "loss": 0.1631, "lr": 2.4787573993738524e-06, "epoch": 5.997876857749469, "percentage": 85.65, "elapsed_time": "14:57:46", "remaining_time": "2:30:22"}
{"current_steps": 1420, "total_steps": 1652, "loss": 0.1523, "lr": 2.377795355837349e-06, "epoch": 6.016985138004246, "percentage": 85.96, "elapsed_time": "15:00:37", "remaining_time": "2:27:08"}
{"current_steps": 1425, "total_steps": 1652, "loss": 0.1663, "lr": 2.2788023694975236e-06, "epoch": 6.038216560509555, "percentage": 86.26, "elapsed_time": "15:03:44", "remaining_time": "2:23:57"}
{"current_steps": 1430, "total_steps": 1652, "loss": 0.1575, "lr": 2.1817895015613134e-06, "epoch": 6.059447983014862, "percentage": 86.56, "elapsed_time": "15:06:53", "remaining_time": "2:20:47"}
{"current_steps": 1435, "total_steps": 1652, "loss": 0.1529, "lr": 2.086767591982608e-06, "epoch": 6.08067940552017, "percentage": 86.86, "elapsed_time": "15:10:02", "remaining_time": "2:17:36"}
{"current_steps": 1440, "total_steps": 1652, "loss": 0.1684, "lr": 1.9937472582510243e-06, "epoch": 6.101910828025478, "percentage": 87.17, "elapsed_time": "15:13:09", "remaining_time": "2:14:26"}
{"current_steps": 1445, "total_steps": 1652, "loss": 0.1554, "lr": 1.902738894205547e-06, "epoch": 6.123142250530786, "percentage": 87.47, "elapsed_time": "15:16:18", "remaining_time": "2:11:15"}
{"current_steps": 1450, "total_steps": 1652, "loss": 0.1596, "lr": 1.8137526688731365e-06, "epoch": 6.144373673036093, "percentage": 87.77, "elapsed_time": "15:19:27", "remaining_time": "2:08:05"}
{"current_steps": 1455, "total_steps": 1652, "loss": 0.1534, "lr": 1.7267985253324803e-06, "epoch": 6.165605095541402, "percentage": 88.08, "elapsed_time": "15:22:35", "remaining_time": "2:04:54"}
{"current_steps": 1460, "total_steps": 1652, "loss": 0.1738, "lr": 1.641886179602974e-06, "epoch": 6.186836518046709, "percentage": 88.38, "elapsed_time": "15:25:44", "remaining_time": "2:01:44"}
{"current_steps": 1465, "total_steps": 1652, "loss": 0.1723, "lr": 1.5590251195590811e-06, "epoch": 6.208067940552017, "percentage": 88.68, "elapsed_time": "15:28:53", "remaining_time": "1:58:34"}
{"current_steps": 1470, "total_steps": 1652, "loss": 0.1708, "lr": 1.4782246038701865e-06, "epoch": 6.229299363057325, "percentage": 88.98, "elapsed_time": "15:32:02", "remaining_time": "1:55:23"}
{"current_steps": 1475, "total_steps": 1652, "loss": 0.1735, "lr": 1.3994936609660493e-06, "epoch": 6.250530785562633, "percentage": 89.29, "elapsed_time": "15:35:11", "remaining_time": "1:52:13"}
{"current_steps": 1480, "total_steps": 1652, "loss": 0.1719, "lr": 1.3228410880280084e-06, "epoch": 6.2717622080679405, "percentage": 89.59, "elapsed_time": "15:38:20", "remaining_time": "1:49:03"}
{"current_steps": 1485, "total_steps": 1652, "loss": 0.158, "lr": 1.248275450005987e-06, "epoch": 6.292993630573249, "percentage": 89.89, "elapsed_time": "15:41:29", "remaining_time": "1:45:52"}
{"current_steps": 1490, "total_steps": 1652, "loss": 0.1674, "lr": 1.1758050786614872e-06, "epoch": 6.314225053078556, "percentage": 90.19, "elapsed_time": "15:44:37", "remaining_time": "1:42:42"}
{"current_steps": 1495, "total_steps": 1652, "loss": 0.1698, "lr": 1.1054380716366064e-06, "epoch": 6.3354564755838645, "percentage": 90.5, "elapsed_time": "15:47:46", "remaining_time": "1:39:31"}
{"current_steps": 1500, "total_steps": 1652, "loss": 0.1568, "lr": 1.0371822915492414e-06, "epoch": 6.356687898089172, "percentage": 90.8, "elapsed_time": "15:50:55", "remaining_time": "1:36:21"}
{"current_steps": 1505, "total_steps": 1652, "loss": 0.1634, "lr": 9.710453651145335e-07, "epoch": 6.37791932059448, "percentage": 91.1, "elapsed_time": "15:54:05", "remaining_time": "1:33:11"}
{"current_steps": 1510, "total_steps": 1652, "loss": 0.1658, "lr": 9.070346822926846e-07, "epoch": 6.399150743099788, "percentage": 91.4, "elapsed_time": "15:57:15", "remaining_time": "1:30:01"}
{"current_steps": 1515, "total_steps": 1652, "loss": 0.1619, "lr": 8.451573954632186e-07, "epoch": 6.420382165605096, "percentage": 91.71, "elapsed_time": "16:00:24", "remaining_time": "1:26:50"}
{"current_steps": 1520, "total_steps": 1652, "loss": 0.1543, "lr": 7.854204186257952e-07, "epoch": 6.441613588110403, "percentage": 92.01, "elapsed_time": "16:03:33", "remaining_time": "1:23:40"}
{"current_steps": 1525, "total_steps": 1652, "loss": 0.1555, "lr": 7.278304266276625e-07, "epoch": 6.462845010615712, "percentage": 92.31, "elapsed_time": "16:06:41", "remaining_time": "1:20:30"}
{"current_steps": 1530, "total_steps": 1652, "loss": 0.1524, "lr": 6.723938544178232e-07, "epoch": 6.484076433121019, "percentage": 92.62, "elapsed_time": "16:09:50", "remaining_time": "1:17:19"}
{"current_steps": 1535, "total_steps": 1652, "loss": 0.1545, "lr": 6.191168963280136e-07, "epoch": 6.505307855626327, "percentage": 92.92, "elapsed_time": "16:12:58", "remaining_time": "1:14:09"}
{"current_steps": 1540, "total_steps": 1652, "loss": 0.1439, "lr": 5.680055053805622e-07, "epoch": 6.526539278131635, "percentage": 93.22, "elapsed_time": "16:16:08", "remaining_time": "1:10:59"}
{"current_steps": 1545, "total_steps": 1652, "loss": 0.1787, "lr": 5.190653926232169e-07, "epoch": 6.547770700636943, "percentage": 93.52, "elapsed_time": "16:19:16", "remaining_time": "1:07:49"}
{"current_steps": 1550, "total_steps": 1652, "loss": 0.1493, "lr": 4.723020264910139e-07, "epoch": 6.56900212314225, "percentage": 93.83, "elapsed_time": "16:22:25", "remaining_time": "1:04:38"}
{"current_steps": 1555, "total_steps": 1652, "loss": 0.1871, "lr": 4.2772063219523875e-07, "epoch": 6.590233545647559, "percentage": 94.13, "elapsed_time": "16:25:33", "remaining_time": "1:01:28"}
{"current_steps": 1560, "total_steps": 1652, "loss": 0.162, "lr": 3.853261911395834e-07, "epoch": 6.611464968152866, "percentage": 94.43, "elapsed_time": "16:28:41", "remaining_time": "0:58:18"}
{"current_steps": 1565, "total_steps": 1652, "loss": 0.1771, "lr": 3.4512344036353727e-07, "epoch": 6.632696390658174, "percentage": 94.73, "elapsed_time": "16:31:50", "remaining_time": "0:55:08"}
{"current_steps": 1570, "total_steps": 1652, "loss": 0.1496, "lr": 3.071168720130779e-07, "epoch": 6.653927813163482, "percentage": 95.04, "elapsed_time": "16:34:58", "remaining_time": "0:51:58"}
{"current_steps": 1575, "total_steps": 1652, "loss": 0.1573, "lr": 2.7131073283873654e-07, "epoch": 6.67515923566879, "percentage": 95.34, "elapsed_time": "16:38:07", "remaining_time": "0:48:47"}
{"current_steps": 1580, "total_steps": 1652, "loss": 0.1609, "lr": 2.3770902372107772e-07, "epoch": 6.6963906581740975, "percentage": 95.64, "elapsed_time": "16:41:16", "remaining_time": "0:45:37"}
{"current_steps": 1585, "total_steps": 1652, "loss": 0.1427, "lr": 2.0631549922364824e-07, "epoch": 6.717622080679406, "percentage": 95.94, "elapsed_time": "16:44:25", "remaining_time": "0:42:27"}
{"current_steps": 1590, "total_steps": 1652, "loss": 0.1706, "lr": 1.7713366717344803e-07, "epoch": 6.738853503184713, "percentage": 96.25, "elapsed_time": "16:47:33", "remaining_time": "0:39:17"}
{"current_steps": 1595, "total_steps": 1652, "loss": 0.1495, "lr": 1.5016678826899055e-07, "epoch": 6.7600849256900215, "percentage": 96.55, "elapsed_time": "16:50:42", "remaining_time": "0:36:07"}
{"current_steps": 1600, "total_steps": 1652, "loss": 0.1599, "lr": 1.2541787571594522e-07, "epoch": 6.781316348195329, "percentage": 96.85, "elapsed_time": "16:53:51", "remaining_time": "0:32:57"}
{"current_steps": 1605, "total_steps": 1652, "loss": 0.1417, "lr": 1.0288969489046008e-07, "epoch": 6.802547770700637, "percentage": 97.15, "elapsed_time": "16:58:16", "remaining_time": "0:29:49"}
{"current_steps": 1610, "total_steps": 1652, "loss": 0.148, "lr": 8.258476303016017e-08, "epoch": 6.823779193205945, "percentage": 97.46, "elapsed_time": "17:01:27", "remaining_time": "0:26:38"}
{"current_steps": 1615, "total_steps": 1652, "loss": 0.1707, "lr": 6.45053489528813e-08, "epoch": 6.845010615711253, "percentage": 97.76, "elapsed_time": "17:04:36", "remaining_time": "0:23:28"}
{"current_steps": 1620, "total_steps": 1652, "loss": 0.1737, "lr": 4.8653472803159576e-08, "epoch": 6.86624203821656, "percentage": 98.06, "elapsed_time": "17:07:44", "remaining_time": "0:20:18"}
{"current_steps": 1625, "total_steps": 1652, "loss": 0.1656, "lr": 3.503090582650081e-08, "epoch": 6.887473460721869, "percentage": 98.37, "elapsed_time": "17:10:53", "remaining_time": "0:17:07"}
{"current_steps": 1630, "total_steps": 1652, "loss": 0.1578, "lr": 2.3639170171474434e-08, "epoch": 6.908704883227176, "percentage": 98.67, "elapsed_time": "17:14:01", "remaining_time": "0:13:57"}
{"current_steps": 1635, "total_steps": 1652, "loss": 0.1607, "lr": 1.4479538719622822e-08, "epoch": 6.929936305732484, "percentage": 98.97, "elapsed_time": "17:17:10", "remaining_time": "0:10:47"}
{"current_steps": 1640, "total_steps": 1652, "loss": 0.174, "lr": 7.553034943243998e-09, "epoch": 6.951167728237792, "percentage": 99.27, "elapsed_time": "17:20:19", "remaining_time": "0:07:36"}
{"current_steps": 1645, "total_steps": 1652, "loss": 0.1679, "lr": 2.8604327910186634e-09, "epoch": 6.9723991507431, "percentage": 99.58, "elapsed_time": "17:23:27", "remaining_time": "0:04:26"}
{"current_steps": 1650, "total_steps": 1652, "loss": 0.1742, "lr": 4.02256601546025e-10, "epoch": 6.993630573248407, "percentage": 99.88, "elapsed_time": "17:26:34", "remaining_time": "0:01:16"}
{"current_steps": 1652, "total_steps": 1652, "epoch": 7.0, "percentage": 100.0, "elapsed_time": "17:28:51", "remaining_time": "0:00:00"}
trainer_state.json
ADDED
The diff for this file is too large to render.
training_args.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:eda998533ea4719df6866e8efca4b5b05a861d418ecd50e2f8d796176a919707
size 8721
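training_args.bin is checked in as a Git LFS pointer (the oid/size lines above); the 8.7 kB object behind it is the serialized arguments for this run. A minimal sketch for inspecting it after an LFS-enabled clone, assuming the file was written by the Hugging Face Trainer's usual torch.save of a TrainingArguments object and that a compatible transformers version is installed:

import torch

# The file is a pickled Python object, so recent PyTorch versions
# require weights_only=False to unpickle it.
args = torch.load("training_args.bin", weights_only=False)
print(type(args).__name__)
print(args.learning_rate, args.num_train_epochs, args.per_device_train_batch_size)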
training_loss.png
ADDED
vocab.json
ADDED
The diff for this file is too large to render.