Upload my folder using huggingface_hub
This view is limited to 50 files because it contains too many changes.
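For reference, this is the kind of call that produces a commit like this one; a minimal sketch using `huggingface_hub`, where the local path and `repo_id` are hypothetical placeholders rather than values taken from the commit:

```python
# Minimal upload sketch with huggingface_hub; folder_path and repo_id are
# hypothetical placeholders -- the real values are not part of this commit.
from huggingface_hub import HfApi

api = HfApi()  # assumes a token is configured via `huggingface-cli login` or HF_TOKEN
api.upload_folder(
    folder_path="./my_folder",                  # local directory holding the EXP_*_3b subfolders
    repo_id="username/qwen2.5-vl-experiments",  # hypothetical target repository
    repo_type="model",
    commit_message="Upload my folder using huggingface_hub",
)
```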
- .gitattributes +8 -0
- EXP_1.1_3b/README.md +69 -0
- EXP_1.1_3b/added_tokens.json +24 -0
- EXP_1.1_3b/all_results.json +12 -0
- EXP_1.1_3b/chat_template.jinja +7 -0
- EXP_1.1_3b/config.json +105 -0
- EXP_1.1_3b/eval_results.json +7 -0
- EXP_1.1_3b/generation_config.json +12 -0
- EXP_1.1_3b/merges.txt +0 -0
- EXP_1.1_3b/model-00001-of-00002.safetensors +3 -0
- EXP_1.1_3b/model-00002-of-00002.safetensors +3 -0
- EXP_1.1_3b/model.safetensors.index.json +831 -0
- EXP_1.1_3b/preprocessor_config.json +29 -0
- EXP_1.1_3b/special_tokens_map.json +31 -0
- EXP_1.1_3b/swanlab_public_config.json +13 -0
- EXP_1.1_3b/tokenizer.json +3 -0
- EXP_1.1_3b/tokenizer_config.json +209 -0
- EXP_1.1_3b/train_results.json +8 -0
- EXP_1.1_3b/trainer_log.jsonl +248 -0
- EXP_1.1_3b/trainer_state.json +1776 -0
- EXP_1.1_3b/training_args.bin +3 -0
- EXP_1.1_3b/training_eval_loss.png +0 -0
- EXP_1.1_3b/training_loss.png +0 -0
- EXP_1.1_3b/video_preprocessor_config.json +86 -0
- EXP_1.1_3b/vocab.json +0 -0
- EXP_1.2_3b/README.md +63 -0
- EXP_1.2_3b/added_tokens.json +24 -0
- EXP_1.2_3b/all_results.json +12 -0
- EXP_1.2_3b/chat_template.jinja +7 -0
- EXP_1.2_3b/config.json +105 -0
- EXP_1.2_3b/eval_results.json +7 -0
- EXP_1.2_3b/generation_config.json +12 -0
- EXP_1.2_3b/merges.txt +0 -0
- EXP_1.2_3b/model-00001-of-00002.safetensors +3 -0
- EXP_1.2_3b/model-00002-of-00002.safetensors +3 -0
- EXP_1.2_3b/model.safetensors.index.json +831 -0
- EXP_1.2_3b/preprocessor_config.json +29 -0
- EXP_1.2_3b/special_tokens_map.json +31 -0
- EXP_1.2_3b/swanlab_public_config.json +13 -0
- EXP_1.2_3b/tokenizer.json +3 -0
- EXP_1.2_3b/tokenizer_config.json +209 -0
- EXP_1.2_3b/train_results.json +8 -0
- EXP_1.2_3b/trainer_log.jsonl +33 -0
- EXP_1.2_3b/trainer_state.json +267 -0
- EXP_1.2_3b/training_args.bin +3 -0
- EXP_1.2_3b/training_loss.png +0 -0
- EXP_1.2_3b/video_preprocessor_config.json +86 -0
- EXP_1.2_3b/vocab.json +0 -0
- EXP_2.1_3b/README.md +69 -0
- EXP_2.1_3b/added_tokens.json +24 -0
.gitattributes
CHANGED
@@ -33,3 +33,11 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+EXP_1.1_3b/tokenizer.json filter=lfs diff=lfs merge=lfs -text
+EXP_1.2_3b/tokenizer.json filter=lfs diff=lfs merge=lfs -text
+EXP_2.1_3b/tokenizer.json filter=lfs diff=lfs merge=lfs -text
+EXP_3.1_3b/tokenizer.json filter=lfs diff=lfs merge=lfs -text
+EXP_3.2_3b/tokenizer.json filter=lfs diff=lfs merge=lfs -text
+EXP_4.1_3b/tokenizer.json filter=lfs diff=lfs merge=lfs -text
+EXP_4.2_3b/tokenizer.json filter=lfs diff=lfs merge=lfs -text
+EXP_4.3_3b/tokenizer.json filter=lfs diff=lfs merge=lfs -text
EXP_1.1_3b/README.md
ADDED
@@ -0,0 +1,69 @@
+---
+library_name: transformers
+license: other
+base_model: /mnt/nvme/hyz/hf/models/Qwen2.5-VL-3B-Instruct
+tags:
+- llama-factory
+- full
+- generated_from_trainer
+model-index:
+- name: EXP_1.1_3b
+  results: []
+---
+
+<!-- This model card has been generated automatically according to the information the Trainer had access to. You
+should probably proofread and complete it, then remove this comment. -->
+
+# EXP_1.1_3b
+
+This model is a fine-tuned version of [/mnt/nvme/hyz/hf/models/Qwen2.5-VL-3B-Instruct](https://huggingface.co//mnt/nvme/hyz/hf/models/Qwen2.5-VL-3B-Instruct) on the multimodal-open-r1-8k-verified_vision-r1 dataset.
+It achieves the following results on the evaluation set:
+- Loss: 0.6923
+
+## Model description
+
+More information needed
+
+## Intended uses & limitations
+
+More information needed
+
+## Training and evaluation data
+
+More information needed
+
+## Training procedure
+
+### Training hyperparameters
+
+The following hyperparameters were used during training:
+- learning_rate: 1e-05
+- train_batch_size: 4
+- eval_batch_size: 1
+- seed: 42
+- distributed_type: multi-GPU
+- num_devices: 8
+- gradient_accumulation_steps: 2
+- total_train_batch_size: 64
+- total_eval_batch_size: 8
+- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
+- lr_scheduler_type: cosine
+- lr_scheduler_warmup_ratio: 0.1
+- num_epochs: 3.0
+
+### Training results
+
+| Training Loss | Epoch  | Step | Validation Loss |
+|:-------------:|:------:|:----:|:---------------:|
+| 0.705         | 0.6161 | 500  | 0.7098          |
+| 0.5925        | 1.2317 | 1000 | 0.6842          |
+| 0.5739        | 1.8478 | 1500 | 0.6650          |
+| 0.4637        | 2.4633 | 2000 | 0.6939          |
+
+
+### Framework versions
+
+- Transformers 4.52.4
+- Pytorch 2.7.1+cu126
+- Datasets 3.6.0
+- Tokenizers 0.21.1
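A minimal text-only inference sketch for the checkpoint this model card describes; it assumes `EXP_1.1_3b` is a local copy of that subfolder from this repository (image and video inputs would additionally be passed through the processor):

```python
# Inference sketch for the fine-tuned checkpoint; "EXP_1.1_3b" is assumed to
# be a local download of that subfolder from this repository.
from transformers import AutoProcessor, Qwen2_5_VLForConditionalGeneration

model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
    "EXP_1.1_3b", torch_dtype="auto", device_map="auto"
)
processor = AutoProcessor.from_pretrained("EXP_1.1_3b")

messages = [{"role": "user", "content": "Summarize chain-of-thought reasoning in one sentence."}]
text = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = processor(text=[text], return_tensors="pt").to(model.device)

out = model.generate(**inputs, max_new_tokens=64)
print(processor.batch_decode(out[:, inputs["input_ids"].shape[1]:], skip_special_tokens=True)[0])
```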
EXP_1.1_3b/added_tokens.json
ADDED
@@ -0,0 +1,24 @@
+{
+  "</tool_call>": 151658,
+  "<tool_call>": 151657,
+  "<|box_end|>": 151649,
+  "<|box_start|>": 151648,
+  "<|endoftext|>": 151643,
+  "<|file_sep|>": 151664,
+  "<|fim_middle|>": 151660,
+  "<|fim_pad|>": 151662,
+  "<|fim_prefix|>": 151659,
+  "<|fim_suffix|>": 151661,
+  "<|im_end|>": 151645,
+  "<|im_start|>": 151644,
+  "<|image_pad|>": 151655,
+  "<|object_ref_end|>": 151647,
+  "<|object_ref_start|>": 151646,
+  "<|quad_end|>": 151651,
+  "<|quad_start|>": 151650,
+  "<|repo_name|>": 151663,
+  "<|video_pad|>": 151656,
+  "<|vision_end|>": 151653,
+  "<|vision_pad|>": 151654,
+  "<|vision_start|>": 151652
+}
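A quick sanity check that these added tokens resolve to the ids listed above, assuming a local copy of the folder:

```python
# Verify an added special token maps to the id listed in added_tokens.json.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("EXP_1.1_3b")     # assumed local copy of this subfolder
print(tok.convert_tokens_to_ids("<|image_pad|>"))     # 151655
print(tok.convert_tokens_to_ids("<|vision_start|>"))  # 151652
```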
EXP_1.1_3b/all_results.json
ADDED
@@ -0,0 +1,12 @@
+{
+  "epoch": 3.0,
+  "eval_loss": 0.6922884583473206,
+  "eval_runtime": 292.8509,
+  "eval_samples_per_second": 19.699,
+  "eval_steps_per_second": 2.465,
+  "total_flos": 598743726292992.0,
+  "train_loss": 0.6124914906099317,
+  "train_runtime": 9399.7215,
+  "train_samples_per_second": 16.571,
+  "train_steps_per_second": 0.259
+}
EXP_1.1_3b/chat_template.jinja
ADDED
@@ -0,0 +1,7 @@
+{% set image_count = namespace(value=0) %}{% set video_count = namespace(value=0) %}{% for message in messages %}{% if loop.first and message['role'] != 'system' %}<|im_start|>system
+You are a helpful assistant.<|im_end|>
+{% endif %}<|im_start|>{{ message['role'] }}
+{% if message['content'] is string %}{{ message['content'] }}<|im_end|>
+{% else %}{% for content in message['content'] %}{% if content['type'] == 'image' or 'image' in content or 'image_url' in content %}{% set image_count.value = image_count.value + 1 %}{% if add_vision_id %}Picture {{ image_count.value }}: {% endif %}<|vision_start|><|image_pad|><|vision_end|>{% elif content['type'] == 'video' or 'video' in content %}{% set video_count.value = video_count.value + 1 %}{% if add_vision_id %}Video {{ video_count.value }}: {% endif %}<|vision_start|><|video_pad|><|vision_end|>{% elif 'text' in content %}{{ content['text'] }}{% endif %}{% endfor %}<|im_end|>
+{% endif %}{% endfor %}{% if add_generation_prompt %}<|im_start|>assistant
+{% endif %}
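This is the template `apply_chat_template` uses; for a message whose content list contains an image entry, it emits the vision placeholder tokens. A sketch of what that render looks like, assuming the processor for this folder is loaded locally:

```python
# Sketch of how the Jinja template above renders a multimodal message.
from transformers import AutoProcessor

processor = AutoProcessor.from_pretrained("EXP_1.1_3b")  # assumed local copy
messages = [
    {"role": "user", "content": [
        {"type": "image"},
        {"type": "text", "text": "What is in this picture?"},
    ]}
]
prompt = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
# prompt is now:
# <|im_start|>system
# You are a helpful assistant.<|im_end|>
# <|im_start|>user
# <|vision_start|><|image_pad|><|vision_end|>What is in this picture?<|im_end|>
# <|im_start|>assistant
```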
EXP_1.1_3b/config.json
ADDED
@@ -0,0 +1,105 @@
+{
+  "architectures": [
+    "Qwen2_5_VLForConditionalGeneration"
+  ],
+  "attention_dropout": 0.0,
+  "bos_token_id": 151643,
+  "eos_token_id": 151645,
+  "hidden_act": "silu",
+  "hidden_size": 2048,
+  "image_token_id": 151655,
+  "initializer_range": 0.02,
+  "intermediate_size": 11008,
+  "max_position_embeddings": 128000,
+  "max_window_layers": 70,
+  "model_type": "qwen2_5_vl",
+  "num_attention_heads": 16,
+  "num_hidden_layers": 36,
+  "num_key_value_heads": 2,
+  "rms_norm_eps": 1e-06,
+  "rope_scaling": {
+    "mrope_section": [
+      16,
+      24,
+      24
+    ],
+    "rope_type": "default",
+    "type": "default"
+  },
+  "rope_theta": 1000000.0,
+  "sliding_window": 32768,
+  "text_config": {
+    "architectures": [
+      "Qwen2_5_VLForConditionalGeneration"
+    ],
+    "attention_dropout": 0.0,
+    "bos_token_id": 151643,
+    "eos_token_id": 151645,
+    "hidden_act": "silu",
+    "hidden_size": 2048,
+    "image_token_id": null,
+    "initializer_range": 0.02,
+    "intermediate_size": 11008,
+    "max_position_embeddings": 128000,
+    "max_window_layers": 70,
+    "model_type": "qwen2_5_vl_text",
+    "num_attention_heads": 16,
+    "num_hidden_layers": 36,
+    "num_key_value_heads": 2,
+    "rms_norm_eps": 1e-06,
+    "rope_scaling": {
+      "mrope_section": [
+        16,
+        24,
+        24
+      ],
+      "rope_type": "default",
+      "type": "default"
+    },
+    "rope_theta": 1000000.0,
+    "sliding_window": 32768,
+    "tie_word_embeddings": true,
+    "torch_dtype": "float32",
+    "use_cache": false,
+    "use_sliding_window": false,
+    "video_token_id": null,
+    "vision_end_token_id": 151653,
+    "vision_start_token_id": 151652,
+    "vision_token_id": 151654,
+    "vocab_size": 151936
+  },
+  "torch_dtype": "bfloat16",
+  "transformers_version": "4.52.4",
+  "use_cache": false,
+  "use_sliding_window": false,
+  "video_token_id": 151656,
+  "vision_config": {
+    "depth": 32,
+    "fullatt_block_indexes": [
+      7,
+      15,
+      23,
+      31
+    ],
+    "hidden_act": "silu",
+    "hidden_size": 1280,
+    "in_channels": 3,
+    "in_chans": 3,
+    "initializer_range": 0.02,
+    "intermediate_size": 3420,
+    "model_type": "qwen2_5_vl",
+    "num_heads": 16,
+    "out_hidden_size": 2048,
+    "patch_size": 14,
+    "spatial_merge_size": 2,
+    "spatial_patch_size": 14,
+    "temporal_patch_size": 2,
+    "tokens_per_second": 2,
+    "torch_dtype": "float32",
+    "window_size": 112
+  },
+  "vision_end_token_id": 151653,
+  "vision_start_token_id": 151652,
+  "vision_token_id": 151654,
+  "vocab_size": 151936
+}
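The nested layout above (top-level keys plus `text_config` and `vision_config` sub-configs) can be inspected after loading; a sketch, assuming a local copy of the folder:

```python
# Load the config above and read a few of its fields.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("EXP_1.1_3b")  # assumed local copy
print(cfg.model_type)                # "qwen2_5_vl"
print(cfg.hidden_size)               # 2048
print(cfg.vision_config.patch_size)  # 14
```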
EXP_1.1_3b/eval_results.json
ADDED
@@ -0,0 +1,7 @@
+{
+  "epoch": 3.0,
+  "eval_loss": 0.6922884583473206,
+  "eval_runtime": 292.8509,
+  "eval_samples_per_second": 19.699,
+  "eval_steps_per_second": 2.465
+}
EXP_1.1_3b/generation_config.json
ADDED
@@ -0,0 +1,12 @@
+{
+  "bos_token_id": 151643,
+  "do_sample": true,
+  "eos_token_id": [
+    151645,
+    151643
+  ],
+  "pad_token_id": 151643,
+  "repetition_penalty": 1.05,
+  "temperature": 1e-06,
+  "transformers_version": "4.52.4"
+}
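These defaults are picked up automatically by `model.generate()`; note that `temperature: 1e-06` makes sampling effectively greedy despite `do_sample: true`. They can also be inspected on their own (a sketch, assuming a local copy of the folder):

```python
# Inspect the generation defaults shipped with the checkpoint.
from transformers import GenerationConfig

gen_cfg = GenerationConfig.from_pretrained("EXP_1.1_3b")  # assumed local copy
print(gen_cfg.temperature)         # 1e-06
print(gen_cfg.repetition_penalty)  # 1.05
```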
EXP_1.1_3b/merges.txt
ADDED
The diff for this file is too large to render.
EXP_1.1_3b/model-00001-of-00002.safetensors
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:feef36fe2a7a4052196322b7ed49c3067cda293d7686323b4a8afd6604364848
+size 4997750760
EXP_1.1_3b/model-00002-of-00002.safetensors
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d9cdcaf4c1e4c5d2e450d5c9a6cb01c20c7ece20c9500874b86ec6f6a582403c
+size 2511587184
EXP_1.1_3b/model.safetensors.index.json
ADDED
|
@@ -0,0 +1,831 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
{
|
| 2 |
+
"metadata": {
|
| 3 |
+
"total_size": 7509245952
|
| 4 |
+
},
|
| 5 |
+
"weight_map": {
|
| 6 |
+
"model.embed_tokens.weight": "model-00001-of-00002.safetensors",
|
| 7 |
+
"model.layers.0.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 8 |
+
"model.layers.0.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 9 |
+
"model.layers.0.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 10 |
+
"model.layers.0.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 11 |
+
"model.layers.0.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 12 |
+
"model.layers.0.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 13 |
+
"model.layers.0.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 14 |
+
"model.layers.0.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 15 |
+
"model.layers.0.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 16 |
+
"model.layers.0.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 17 |
+
"model.layers.0.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 18 |
+
"model.layers.0.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 19 |
+
"model.layers.1.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 20 |
+
"model.layers.1.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 21 |
+
"model.layers.1.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 22 |
+
"model.layers.1.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 23 |
+
"model.layers.1.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 24 |
+
"model.layers.1.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 25 |
+
"model.layers.1.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 26 |
+
"model.layers.1.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 27 |
+
"model.layers.1.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 28 |
+
"model.layers.1.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 29 |
+
"model.layers.1.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 30 |
+
"model.layers.1.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 31 |
+
"model.layers.10.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 32 |
+
"model.layers.10.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 33 |
+
"model.layers.10.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 34 |
+
"model.layers.10.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 35 |
+
"model.layers.10.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 36 |
+
"model.layers.10.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 37 |
+
"model.layers.10.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 38 |
+
"model.layers.10.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 39 |
+
"model.layers.10.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 40 |
+
"model.layers.10.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 41 |
+
"model.layers.10.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 42 |
+
"model.layers.10.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 43 |
+
"model.layers.11.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 44 |
+
"model.layers.11.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 45 |
+
"model.layers.11.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 46 |
+
"model.layers.11.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 47 |
+
"model.layers.11.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 48 |
+
"model.layers.11.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 49 |
+
"model.layers.11.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 50 |
+
"model.layers.11.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 51 |
+
"model.layers.11.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 52 |
+
"model.layers.11.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 53 |
+
"model.layers.11.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 54 |
+
"model.layers.11.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 55 |
+
"model.layers.12.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 56 |
+
"model.layers.12.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 57 |
+
"model.layers.12.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 58 |
+
"model.layers.12.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 59 |
+
"model.layers.12.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 60 |
+
"model.layers.12.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 61 |
+
"model.layers.12.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 62 |
+
"model.layers.12.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 63 |
+
"model.layers.12.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 64 |
+
"model.layers.12.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 65 |
+
"model.layers.12.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 66 |
+
"model.layers.12.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 67 |
+
"model.layers.13.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 68 |
+
"model.layers.13.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 69 |
+
"model.layers.13.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 70 |
+
"model.layers.13.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 71 |
+
"model.layers.13.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 72 |
+
"model.layers.13.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 73 |
+
"model.layers.13.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 74 |
+
"model.layers.13.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 75 |
+
"model.layers.13.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 76 |
+
"model.layers.13.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 77 |
+
"model.layers.13.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 78 |
+
"model.layers.13.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 79 |
+
"model.layers.14.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 80 |
+
"model.layers.14.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 81 |
+
"model.layers.14.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 82 |
+
"model.layers.14.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 83 |
+
"model.layers.14.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 84 |
+
"model.layers.14.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 85 |
+
"model.layers.14.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 86 |
+
"model.layers.14.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 87 |
+
"model.layers.14.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 88 |
+
"model.layers.14.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 89 |
+
"model.layers.14.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 90 |
+
"model.layers.14.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 91 |
+
"model.layers.15.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 92 |
+
"model.layers.15.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 93 |
+
"model.layers.15.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 94 |
+
"model.layers.15.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 95 |
+
"model.layers.15.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 96 |
+
"model.layers.15.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 97 |
+
"model.layers.15.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 98 |
+
"model.layers.15.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 99 |
+
"model.layers.15.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 100 |
+
"model.layers.15.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 101 |
+
"model.layers.15.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 102 |
+
"model.layers.15.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 103 |
+
"model.layers.16.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 104 |
+
"model.layers.16.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 105 |
+
"model.layers.16.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 106 |
+
"model.layers.16.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 107 |
+
"model.layers.16.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 108 |
+
"model.layers.16.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 109 |
+
"model.layers.16.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 110 |
+
"model.layers.16.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 111 |
+
"model.layers.16.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 112 |
+
"model.layers.16.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 113 |
+
"model.layers.16.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 114 |
+
"model.layers.16.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 115 |
+
"model.layers.17.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 116 |
+
"model.layers.17.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 117 |
+
"model.layers.17.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 118 |
+
"model.layers.17.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 119 |
+
"model.layers.17.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 120 |
+
"model.layers.17.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 121 |
+
"model.layers.17.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 122 |
+
"model.layers.17.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 123 |
+
"model.layers.17.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 124 |
+
"model.layers.17.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 125 |
+
"model.layers.17.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 126 |
+
"model.layers.17.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 127 |
+
"model.layers.18.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 128 |
+
"model.layers.18.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 129 |
+
"model.layers.18.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 130 |
+
"model.layers.18.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 131 |
+
"model.layers.18.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 132 |
+
"model.layers.18.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 133 |
+
"model.layers.18.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 134 |
+
"model.layers.18.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 135 |
+
"model.layers.18.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 136 |
+
"model.layers.18.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 137 |
+
"model.layers.18.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 138 |
+
"model.layers.18.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 139 |
+
"model.layers.19.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 140 |
+
"model.layers.19.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 141 |
+
"model.layers.19.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 142 |
+
"model.layers.19.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 143 |
+
"model.layers.19.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 144 |
+
"model.layers.19.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 145 |
+
"model.layers.19.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 146 |
+
"model.layers.19.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 147 |
+
"model.layers.19.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 148 |
+
"model.layers.19.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 149 |
+
"model.layers.19.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 150 |
+
"model.layers.19.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 151 |
+
"model.layers.2.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 152 |
+
"model.layers.2.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 153 |
+
"model.layers.2.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 154 |
+
"model.layers.2.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 155 |
+
"model.layers.2.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 156 |
+
"model.layers.2.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 157 |
+
"model.layers.2.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 158 |
+
"model.layers.2.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 159 |
+
"model.layers.2.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 160 |
+
"model.layers.2.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 161 |
+
"model.layers.2.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 162 |
+
"model.layers.2.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 163 |
+
"model.layers.20.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 164 |
+
"model.layers.20.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 165 |
+
"model.layers.20.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 166 |
+
"model.layers.20.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 167 |
+
"model.layers.20.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 168 |
+
"model.layers.20.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 169 |
+
"model.layers.20.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 170 |
+
"model.layers.20.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 171 |
+
"model.layers.20.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 172 |
+
"model.layers.20.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 173 |
+
"model.layers.20.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 174 |
+
"model.layers.20.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 175 |
+
"model.layers.21.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 176 |
+
"model.layers.21.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 177 |
+
"model.layers.21.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 178 |
+
"model.layers.21.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 179 |
+
"model.layers.21.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 180 |
+
"model.layers.21.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 181 |
+
"model.layers.21.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 182 |
+
"model.layers.21.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 183 |
+
"model.layers.21.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 184 |
+
"model.layers.21.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 185 |
+
"model.layers.21.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 186 |
+
"model.layers.21.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 187 |
+
"model.layers.22.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 188 |
+
"model.layers.22.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 189 |
+
"model.layers.22.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 190 |
+
"model.layers.22.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 191 |
+
"model.layers.22.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 192 |
+
"model.layers.22.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 193 |
+
"model.layers.22.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 194 |
+
"model.layers.22.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 195 |
+
"model.layers.22.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 196 |
+
"model.layers.22.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 197 |
+
"model.layers.22.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 198 |
+
"model.layers.22.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 199 |
+
"model.layers.23.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 200 |
+
"model.layers.23.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 201 |
+
"model.layers.23.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 202 |
+
"model.layers.23.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 203 |
+
"model.layers.23.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 204 |
+
"model.layers.23.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 205 |
+
"model.layers.23.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 206 |
+
"model.layers.23.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 207 |
+
"model.layers.23.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 208 |
+
"model.layers.23.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 209 |
+
"model.layers.23.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 210 |
+
"model.layers.23.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 211 |
+
"model.layers.24.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 212 |
+
"model.layers.24.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 213 |
+
"model.layers.24.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 214 |
+
"model.layers.24.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 215 |
+
"model.layers.24.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 216 |
+
"model.layers.24.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 217 |
+
"model.layers.24.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 218 |
+
"model.layers.24.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 219 |
+
"model.layers.24.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 220 |
+
"model.layers.24.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 221 |
+
"model.layers.24.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 222 |
+
"model.layers.24.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 223 |
+
"model.layers.25.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 224 |
+
"model.layers.25.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 225 |
+
"model.layers.25.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 226 |
+
"model.layers.25.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 227 |
+
"model.layers.25.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 228 |
+
"model.layers.25.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 229 |
+
"model.layers.25.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 230 |
+
"model.layers.25.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 231 |
+
"model.layers.25.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 232 |
+
"model.layers.25.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 233 |
+
"model.layers.25.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 234 |
+
"model.layers.25.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 235 |
+
"model.layers.26.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 236 |
+
"model.layers.26.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 237 |
+
"model.layers.26.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 238 |
+
"model.layers.26.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 239 |
+
"model.layers.26.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 240 |
+
"model.layers.26.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 241 |
+
"model.layers.26.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 242 |
+
"model.layers.26.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 243 |
+
"model.layers.26.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 244 |
+
"model.layers.26.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 245 |
+
"model.layers.26.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 246 |
+
"model.layers.26.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 247 |
+
"model.layers.27.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 248 |
+
"model.layers.27.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 249 |
+
"model.layers.27.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 250 |
+
"model.layers.27.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 251 |
+
"model.layers.27.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 252 |
+
"model.layers.27.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 253 |
+
"model.layers.27.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 254 |
+
"model.layers.27.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 255 |
+
"model.layers.27.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 256 |
+
"model.layers.27.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 257 |
+
"model.layers.27.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 258 |
+
"model.layers.27.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 259 |
+
"model.layers.28.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 260 |
+
"model.layers.28.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 261 |
+
"model.layers.28.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 262 |
+
"model.layers.28.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 263 |
+
"model.layers.28.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 264 |
+
"model.layers.28.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 265 |
+
"model.layers.28.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 266 |
+
"model.layers.28.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 267 |
+
"model.layers.28.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 268 |
+
"model.layers.28.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 269 |
+
"model.layers.28.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 270 |
+
"model.layers.28.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 271 |
+
"model.layers.29.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 272 |
+
"model.layers.29.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 273 |
+
"model.layers.29.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 274 |
+
"model.layers.29.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 275 |
+
"model.layers.29.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 276 |
+
"model.layers.29.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 277 |
+
"model.layers.29.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 278 |
+
"model.layers.29.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 279 |
+
"model.layers.29.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 280 |
+
"model.layers.29.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 281 |
+
"model.layers.29.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 282 |
+
"model.layers.29.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 283 |
+
"model.layers.3.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 284 |
+
"model.layers.3.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 285 |
+
"model.layers.3.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 286 |
+
"model.layers.3.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 287 |
+
"model.layers.3.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 288 |
+
"model.layers.3.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 289 |
+
"model.layers.3.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 290 |
+
"model.layers.3.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 291 |
+
"model.layers.3.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 292 |
+
"model.layers.3.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 293 |
+
"model.layers.3.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 294 |
+
"model.layers.3.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 295 |
+
"model.layers.30.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 296 |
+
"model.layers.30.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 297 |
+
"model.layers.30.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 298 |
+
"model.layers.30.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 299 |
+
"model.layers.30.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 300 |
+
"model.layers.30.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 301 |
+
"model.layers.30.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 302 |
+
"model.layers.30.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 303 |
+
"model.layers.30.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 304 |
+
"model.layers.30.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 305 |
+
"model.layers.30.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 306 |
+
"model.layers.30.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 307 |
+
"model.layers.31.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 308 |
+
"model.layers.31.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 309 |
+
"model.layers.31.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 310 |
+
"model.layers.31.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 311 |
+
"model.layers.31.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 312 |
+
"model.layers.31.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 313 |
+
"model.layers.31.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 314 |
+
"model.layers.31.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 315 |
+
"model.layers.31.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 316 |
+
"model.layers.31.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 317 |
+
"model.layers.31.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 318 |
+
"model.layers.31.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 319 |
+
"model.layers.32.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 320 |
+
"model.layers.32.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 321 |
+
"model.layers.32.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 322 |
+
"model.layers.32.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 323 |
+
"model.layers.32.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 324 |
+
"model.layers.32.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 325 |
+
"model.layers.32.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 326 |
+
"model.layers.32.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 327 |
+
"model.layers.32.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 328 |
+
"model.layers.32.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 329 |
+
"model.layers.32.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 330 |
+
"model.layers.32.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 331 |
+
"model.layers.33.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 332 |
+
"model.layers.33.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 333 |
+
"model.layers.33.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 334 |
+
"model.layers.33.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 335 |
+
"model.layers.33.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 336 |
+
"model.layers.33.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 337 |
+
"model.layers.33.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 338 |
+
"model.layers.33.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 339 |
+
"model.layers.33.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 340 |
+
"model.layers.33.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 341 |
+
"model.layers.33.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 342 |
+
"model.layers.33.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 343 |
+
"model.layers.34.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 344 |
+
"model.layers.34.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 345 |
+
"model.layers.34.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 346 |
+
"model.layers.34.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 347 |
+
"model.layers.34.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 348 |
+
"model.layers.34.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 349 |
+
"model.layers.34.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 350 |
+
"model.layers.34.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 351 |
+
"model.layers.34.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 352 |
+
"model.layers.34.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 353 |
+
"model.layers.34.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 354 |
+
"model.layers.34.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 355 |
+
"model.layers.35.input_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 356 |
+
"model.layers.35.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
|
| 357 |
+
"model.layers.35.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
|
| 358 |
+
"model.layers.35.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
|
| 359 |
+
"model.layers.35.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
|
| 360 |
+
"model.layers.35.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
|
| 361 |
+
"model.layers.35.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
|
| 362 |
+
"model.layers.35.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
|
| 363 |
+
"model.layers.35.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
|
| 364 |
+
"model.layers.35.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
|
| 365 |
+
"model.layers.35.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
|
| 366 |
+
"model.layers.35.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
|
| 367 |
+
"model.layers.4.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 368 |
+
"model.layers.4.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 369 |
+
"model.layers.4.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 370 |
+
"model.layers.4.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 371 |
+
"model.layers.4.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 372 |
+
"model.layers.4.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
|
| 373 |
+
"model.layers.4.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
|
| 374 |
+
"model.layers.4.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
|
| 375 |
+
"model.layers.4.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
|
| 376 |
+
"model.layers.4.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
|
| 377 |
+
"model.layers.4.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
|
| 378 |
+
"model.layers.4.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
|
| 379 |
+
"model.layers.5.input_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 380 |
+
"model.layers.5.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 381 |
+
"model.layers.5.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 382 |
+
"model.layers.5.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 383 |
+
"model.layers.5.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
|
| 384 |
+    "model.layers.5.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.5.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.5.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.6.input_layernorm.weight": "model-00001-of-00002.safetensors",
+    "model.layers.6.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.6.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.6.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+    "model.layers.6.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.6.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.6.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.7.input_layernorm.weight": "model-00001-of-00002.safetensors",
+    "model.layers.7.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.7.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.7.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.7.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+    "model.layers.7.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.7.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.7.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.7.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.8.input_layernorm.weight": "model-00001-of-00002.safetensors",
+    "model.layers.8.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.8.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.8.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.8.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+    "model.layers.8.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.8.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.8.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.8.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.8.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.8.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.9.input_layernorm.weight": "model-00001-of-00002.safetensors",
+    "model.layers.9.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.9.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.9.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.9.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+    "model.layers.9.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.9.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+    "model.layers.9.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+    "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+    "model.norm.weight": "model-00002-of-00002.safetensors",
+    "visual.blocks.0.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.0.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.0.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.0.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.0.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.0.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.0.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.0.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.0.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.0.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.0.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.0.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.1.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.1.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.1.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.1.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.1.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.1.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.1.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.1.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.1.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.1.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.1.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.1.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.10.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.10.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.10.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.10.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.10.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.10.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.10.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.10.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.10.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.10.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.10.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.10.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.11.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.11.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.11.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.11.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.11.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.11.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.11.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.11.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.11.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.11.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.11.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.11.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.12.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.12.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.12.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.12.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.12.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.12.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.12.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.12.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.12.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.12.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.12.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.12.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.13.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.13.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.13.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.13.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.13.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.13.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.13.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.13.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.13.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.13.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.13.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.13.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.14.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.14.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.14.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.14.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.14.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.14.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.14.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.14.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.14.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.14.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.14.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.14.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.15.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.15.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.15.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.15.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.15.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.15.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.15.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.15.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.15.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.15.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.15.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.15.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.16.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.16.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.16.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.16.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.16.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.16.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.16.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.16.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.16.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.16.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.16.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.16.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.17.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.17.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.17.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.17.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.17.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.17.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.17.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.17.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.17.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.17.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.17.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.17.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.18.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.18.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.18.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.18.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.18.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.18.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.18.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.18.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.18.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.18.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.18.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.18.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.19.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.19.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.19.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.19.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.19.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.19.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.19.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.19.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.19.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.19.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.19.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.19.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.2.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.2.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.2.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.2.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.2.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.2.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.2.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.2.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.2.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.2.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.2.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.2.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.20.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.20.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.20.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.20.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.20.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.20.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.20.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.20.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.20.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.20.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.20.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.20.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.21.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.21.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.21.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.21.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.21.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.21.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.21.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.21.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.21.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.21.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.21.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.21.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.22.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.22.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.22.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.22.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.22.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.22.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.22.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.22.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.22.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.22.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.22.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.22.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.23.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.23.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.23.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.23.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.23.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.23.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.23.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.23.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.23.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.23.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.23.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.23.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.24.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.24.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.24.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.24.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.24.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.24.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.24.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.24.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.24.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.24.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.24.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.24.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.25.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.25.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.25.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.25.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.25.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.25.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.25.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.25.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.25.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.25.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.25.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.25.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.26.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.26.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.26.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.26.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.26.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.26.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.26.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.26.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.26.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.26.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.26.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.26.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.27.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.27.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.27.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.27.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.27.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.27.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.27.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.27.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.27.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.27.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.27.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.27.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.28.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.28.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.28.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.28.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.28.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.28.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.28.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.28.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.28.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.28.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.28.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.28.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.29.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.29.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.29.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.29.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.29.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.29.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.29.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.29.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.29.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.29.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.29.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.29.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.3.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.3.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.3.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.3.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.3.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.3.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.3.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.3.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.3.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.3.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.3.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.3.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.30.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.30.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.30.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.30.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.30.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.30.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.30.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.30.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.30.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.30.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.30.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.30.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.31.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.31.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.31.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.31.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.31.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.31.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.31.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.31.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.31.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.31.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.31.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.31.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.4.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.4.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.4.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.4.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.4.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.4.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.4.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.4.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.4.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.4.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.4.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.4.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.5.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.5.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.5.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.5.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.5.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.5.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.5.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.5.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.5.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.5.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.5.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.5.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.6.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.6.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.6.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.6.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.6.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.6.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.6.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.6.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.6.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.6.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.6.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.6.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.7.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.7.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.7.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.7.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.7.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.7.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.7.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.7.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.7.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.7.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.7.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.7.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.8.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.8.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.8.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.8.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.8.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.8.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.8.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.8.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.8.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.8.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.8.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.8.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.9.attn.proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.9.attn.proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.9.attn.qkv.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.9.attn.qkv.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.9.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.9.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.9.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.9.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.9.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
+    "visual.blocks.9.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.9.norm1.weight": "model-00001-of-00002.safetensors",
+    "visual.blocks.9.norm2.weight": "model-00001-of-00002.safetensors",
+    "visual.merger.ln_q.weight": "model-00001-of-00002.safetensors",
+    "visual.merger.mlp.0.bias": "model-00001-of-00002.safetensors",
+    "visual.merger.mlp.0.weight": "model-00001-of-00002.safetensors",
+    "visual.merger.mlp.2.bias": "model-00001-of-00002.safetensors",
+    "visual.merger.mlp.2.weight": "model-00001-of-00002.safetensors",
+    "visual.patch_embed.proj.weight": "model-00001-of-00002.safetensors"
+  }
+}
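The `weight_map` above assigns every parameter name to one of the two shards, so a loader can open only the shard that holds a given tensor. A minimal sketch of resolving one tensor through the index, assuming a local copy of the EXP_1.1_3b folder and the `safetensors` package (the directory path and tensor name here are just examples):

```python
import json
from safetensors import safe_open

# Hypothetical local path to the EXP_1.1_3b checkpoint directory.
ckpt_dir = "EXP_1.1_3b"

# The index maps each parameter name to the shard file that stores it.
with open(f"{ckpt_dir}/model.safetensors.index.json") as fp:
    index = json.load(fp)

name = "model.norm.weight"           # in shard 2 per the weight_map above
shard = index["weight_map"][name]    # -> "model-00002-of-00002.safetensors"

# Open only the needed shard and read the single tensor lazily.
with safe_open(f"{ckpt_dir}/{shard}", framework="pt") as f:
    tensor = f.get_tensor(name)
print(name, tuple(tensor.shape))
```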
EXP_1.1_3b/preprocessor_config.json
ADDED
@@ -0,0 +1,29 @@
+{
+  "do_convert_rgb": true,
+  "do_normalize": true,
+  "do_rescale": true,
+  "do_resize": true,
+  "image_mean": [
+    0.48145466,
+    0.4578275,
+    0.40821073
+  ],
+  "image_processor_type": "Qwen2VLImageProcessor",
+  "image_std": [
+    0.26862954,
+    0.26130258,
+    0.27577711
+  ],
+  "max_pixels": 12845056,
+  "merge_size": 2,
+  "min_pixels": 3136,
+  "patch_size": 14,
+  "processor_class": "Qwen2_5_VLProcessor",
+  "resample": 3,
+  "rescale_factor": 0.00392156862745098,
+  "size": {
+    "longest_edge": 12845056,
+    "shortest_edge": 3136
+  },
+  "temporal_patch_size": 2
+}
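With `patch_size` 14 and `merge_size` 2, each final visual token covers a 28x28 pixel area, so `max_pixels` of 12,845,056 corresponds to 16,384 merged tokens and `min_pixels` of 3,136 to 4. A rough sketch of the pixel-budget rule implied by these values, assuming the rounding behavior of the Qwen2-VL family's smart-resize (not the exact `Qwen2VLImageProcessor` code):

```python
import math

# Values taken from the preprocessor_config.json above.
MIN_PIXELS, MAX_PIXELS = 3136, 12845056
FACTOR = 14 * 2  # patch_size * merge_size: one merged token spans 28x28 px

def smart_resize(h: int, w: int) -> tuple[int, int]:
    """Round to multiples of 28, then rescale so h*w stays in budget."""
    h2 = round(h / FACTOR) * FACTOR
    w2 = round(w / FACTOR) * FACTOR
    if h2 * w2 > MAX_PIXELS:
        s = math.sqrt(h * w / MAX_PIXELS)
        h2 = math.floor(h / s / FACTOR) * FACTOR
        w2 = math.floor(w / s / FACTOR) * FACTOR
    elif h2 * w2 < MIN_PIXELS:
        s = math.sqrt(MIN_PIXELS / (h * w))
        h2 = math.ceil(h * s / FACTOR) * FACTOR
        w2 = math.ceil(w * s / FACTOR) * FACTOR
    return h2, w2

print(smart_resize(1080, 1920))  # a 1080p frame fits well under the budget
```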
EXP_1.1_3b/special_tokens_map.json
ADDED
@@ -0,0 +1,31 @@
+{
+  "additional_special_tokens": [
+    "<|im_start|>",
+    "<|im_end|>",
+    "<|object_ref_start|>",
+    "<|object_ref_end|>",
+    "<|box_start|>",
+    "<|box_end|>",
+    "<|quad_start|>",
+    "<|quad_end|>",
+    "<|vision_start|>",
+    "<|vision_end|>",
+    "<|vision_pad|>",
+    "<|image_pad|>",
+    "<|video_pad|>"
+  ],
+  "eos_token": {
+    "content": "<|im_end|>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  },
+  "pad_token": {
+    "content": "<|endoftext|>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  }
+}
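Note that padding deliberately uses `<|endoftext|>` rather than the `<|im_end|>` EOS token. A quick check that a loaded tokenizer resolves both as this map specifies, assuming the folder is available locally (a repo id would work the same way):

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("EXP_1.1_3b")  # hypothetical local path

# eos/pad should match special_tokens_map.json above.
assert tok.eos_token == "<|im_end|>"
assert tok.pad_token == "<|endoftext|>"
print(tok.eos_token_id, tok.pad_token_id)  # 151645, 151643 per tokenizer_config.json
```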
EXP_1.1_3b/swanlab_public_config.json
ADDED
@@ -0,0 +1,13 @@
+{
+  "project_name": "LLaMA-Factory",
+  "version": "0.6.1",
+  "run_id": "run-20250610_130347-a3b1799d",
+  "swanlog_dir": "/mnt/nvme/hyz/LLaMA-Factory/swanlog",
+  "run_dir": "/mnt/nvme/hyz/LLaMA-Factory/swanlog/run-20250610_130347-a3b1799d",
+  "cloud": {
+    "project_name": "LLaMA-Factory",
+    "project_url": "https://swanlab.cn/@huyuanze/LLaMA-Factory",
+    "experiment_name": "/mnt/nvme/hyz/mm_homework/checkpoints_full/EXP_1.1_3b",
+    "experiment_url": "https://swanlab.cn/@huyuanze/LLaMA-Factory/runs/s4avvpm017ngll7xxeef0"
+  }
+}
EXP_1.1_3b/tokenizer.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9c5ae00e602b8860cbd784ba82a8aa14e8feecec692e7076590d014d7b7fdafa
+size 11421896
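This is a Git LFS pointer, not the tokenizer itself: the repository tracks `tokenizer.json` via LFS (see the `.gitattributes` entries in this commit), and the pointer pins the real blob by size and SHA-256. A small sketch for verifying a materialized copy against the pointer, assuming the file has been fetched with `git lfs pull` or a hub download (the path is an example):

```python
import hashlib, os

path = "EXP_1.1_3b/tokenizer.json"  # hypothetical local path after LFS fetch

# The pointer above pins the blob: 11,421,896 bytes, sha256 9c5ae00e...
digest = hashlib.sha256(open(path, "rb").read()).hexdigest()
print(os.path.getsize(path) == 11421896,
      digest == "9c5ae00e602b8860cbd784ba82a8aa14e8feecec692e7076590d014d7b7fdafa")
```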
EXP_1.1_3b/tokenizer_config.json
ADDED
@@ -0,0 +1,209 @@
+{
+  "add_bos_token": false,
+  "add_prefix_space": false,
+  "added_tokens_decoder": {
+    "151643": {
+      "content": "<|endoftext|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151644": {
+      "content": "<|im_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151645": {
+      "content": "<|im_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151646": {
+      "content": "<|object_ref_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151647": {
+      "content": "<|object_ref_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151648": {
+      "content": "<|box_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151649": {
+      "content": "<|box_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151650": {
+      "content": "<|quad_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151651": {
+      "content": "<|quad_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151652": {
+      "content": "<|vision_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151653": {
+      "content": "<|vision_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151654": {
+      "content": "<|vision_pad|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151655": {
+      "content": "<|image_pad|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151656": {
+      "content": "<|video_pad|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151657": {
+      "content": "<tool_call>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151658": {
+      "content": "</tool_call>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151659": {
+      "content": "<|fim_prefix|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151660": {
+      "content": "<|fim_middle|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151661": {
+      "content": "<|fim_suffix|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151662": {
+      "content": "<|fim_pad|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151663": {
+      "content": "<|repo_name|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151664": {
+      "content": "<|file_sep|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    }
+  },
+  "additional_special_tokens": [
+    "<|im_start|>",
+    "<|im_end|>",
+    "<|object_ref_start|>",
+    "<|object_ref_end|>",
+    "<|box_start|>",
+    "<|box_end|>",
+    "<|quad_start|>",
+    "<|quad_end|>",
+    "<|vision_start|>",
+    "<|vision_end|>",
+    "<|vision_pad|>",
+    "<|image_pad|>",
+    "<|video_pad|>"
+  ],
+  "bos_token": null,
+  "clean_up_tokenization_spaces": false,
+  "eos_token": "<|im_end|>",
+  "errors": "replace",
+  "extra_special_tokens": {},
+  "model_max_length": 131072,
+  "pad_token": "<|endoftext|>",
+  "padding_side": "right",
+  "processor_class": "Qwen2_5_VLProcessor",
+  "split_special_tokens": false,
+  "tokenizer_class": "Qwen2Tokenizer",
+  "unk_token": null
+}
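The `added_tokens_decoder` pins the control-token ids 151643 through 151664, and `chat_template.jinja` (also part of this upload) wraps turns in `<|im_start|>`/`<|im_end|>`. A minimal sketch exercising both, assuming a local checkout and a transformers version recent enough to pick up `chat_template.jinja`:

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("EXP_1.1_3b")  # hypothetical local path

# Ids are fixed by added_tokens_decoder above.
ids = tok.convert_tokens_to_ids(["<|im_start|>", "<|im_end|>", "<|image_pad|>"])
print(ids)  # expected: [151644, 151645, 151655]

# Render a single user turn through the checkpoint's chat template.
msgs = [{"role": "user", "content": "hello"}]
print(tok.apply_chat_template(msgs, tokenize=False, add_generation_prompt=True))
```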
EXP_1.1_3b/train_results.json
ADDED
@@ -0,0 +1,8 @@
+{
+  "epoch": 3.0,
+  "total_flos": 598743726292992.0,
+  "train_loss": 0.6124914906099317,
+  "train_runtime": 9399.7215,
+  "train_samples_per_second": 16.571,
+  "train_steps_per_second": 0.259
+}
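These throughput figures are self-consistent with the trainer log that follows: 0.259 steps/s over the 9,399.7 s runtime gives roughly 2,434 steps, matching the log's 2,436 total steps, and the sample rate implies an effective batch of about 64 samples per step. A quick arithmetic check:

```python
# Sanity-checking train_results.json against trainer_log.jsonl below.
runtime = 9399.7215                 # seconds
samples = 16.571 * runtime          # ~155,763 samples across 3 epochs
steps = 0.259 * runtime             # ~2,434 steps vs. total_steps = 2436
print(round(samples), round(samples / 3), round(steps), round(samples / 2436))
```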
EXP_1.1_3b/trainer_log.jsonl
ADDED
@@ -0,0 +1,248 @@
+{"current_steps": 10, "total_steps": 2436, "loss": 1.5881, "lr": 3.6885245901639347e-07, "epoch": 0.012322858903265557, "percentage": 0.41, "elapsed_time": "0:00:38", "remaining_time": "2:35:42"}
+{"current_steps": 20, "total_steps": 2436, "loss": 1.6078, "lr": 7.78688524590164e-07, "epoch": 0.024645717806531114, "percentage": 0.82, "elapsed_time": "0:01:10", "remaining_time": "2:22:11"}
+{"current_steps": 30, "total_steps": 2436, "loss": 1.5155, "lr": 1.1885245901639345e-06, "epoch": 0.036968576709796676, "percentage": 1.23, "elapsed_time": "0:01:44", "remaining_time": "2:19:57"}
+{"current_steps": 40, "total_steps": 2436, "loss": 1.3049, "lr": 1.5983606557377053e-06, "epoch": 0.04929143561306223, "percentage": 1.64, "elapsed_time": "0:02:18", "remaining_time": "2:18:25"}
+{"current_steps": 50, "total_steps": 2436, "loss": 1.1069, "lr": 2.0081967213114756e-06, "epoch": 0.061614294516327786, "percentage": 2.05, "elapsed_time": "0:02:52", "remaining_time": "2:16:49"}
+{"current_steps": 60, "total_steps": 2436, "loss": 0.9964, "lr": 2.418032786885246e-06, "epoch": 0.07393715341959335, "percentage": 2.46, "elapsed_time": "0:03:25", "remaining_time": "2:15:39"}
+{"current_steps": 70, "total_steps": 2436, "loss": 0.9526, "lr": 2.8278688524590166e-06, "epoch": 0.0862600123228589, "percentage": 2.87, "elapsed_time": "0:04:01", "remaining_time": "2:15:51"}
+{"current_steps": 80, "total_steps": 2436, "loss": 0.9071, "lr": 3.2377049180327876e-06, "epoch": 0.09858287122612445, "percentage": 3.28, "elapsed_time": "0:04:33", "remaining_time": "2:14:24"}
+{"current_steps": 90, "total_steps": 2436, "loss": 0.8833, "lr": 3.6475409836065577e-06, "epoch": 0.11090573012939002, "percentage": 3.69, "elapsed_time": "0:05:07", "remaining_time": "2:13:34"}
+{"current_steps": 100, "total_steps": 2436, "loss": 0.8617, "lr": 4.057377049180329e-06, "epoch": 0.12322858903265557, "percentage": 4.11, "elapsed_time": "0:05:43", "remaining_time": "2:13:41"}
+{"current_steps": 110, "total_steps": 2436, "loss": 0.8439, "lr": 4.467213114754098e-06, "epoch": 0.13555144793592114, "percentage": 4.52, "elapsed_time": "0:06:18", "remaining_time": "2:13:27"}
+{"current_steps": 120, "total_steps": 2436, "loss": 0.8494, "lr": 4.877049180327869e-06, "epoch": 0.1478743068391867, "percentage": 4.93, "elapsed_time": "0:06:54", "remaining_time": "2:13:27"}
+{"current_steps": 130, "total_steps": 2436, "loss": 0.8114, "lr": 5.286885245901639e-06, "epoch": 0.16019716574245224, "percentage": 5.34, "elapsed_time": "0:07:28", "remaining_time": "2:12:29"}
+{"current_steps": 140, "total_steps": 2436, "loss": 0.7865, "lr": 5.696721311475411e-06, "epoch": 0.1725200246457178, "percentage": 5.75, "elapsed_time": "0:08:01", "remaining_time": "2:11:41"}
+{"current_steps": 150, "total_steps": 2436, "loss": 0.8102, "lr": 6.10655737704918e-06, "epoch": 0.18484288354898337, "percentage": 6.16, "elapsed_time": "0:08:33", "remaining_time": "2:10:30"}
+{"current_steps": 160, "total_steps": 2436, "loss": 0.7978, "lr": 6.516393442622952e-06, "epoch": 0.1971657424522489, "percentage": 6.57, "elapsed_time": "0:09:08", "remaining_time": "2:10:06"}
+{"current_steps": 170, "total_steps": 2436, "loss": 0.7969, "lr": 6.926229508196722e-06, "epoch": 0.20948860135551448, "percentage": 6.98, "elapsed_time": "0:09:44", "remaining_time": "2:09:56"}
+{"current_steps": 180, "total_steps": 2436, "loss": 0.7821, "lr": 7.336065573770492e-06, "epoch": 0.22181146025878004, "percentage": 7.39, "elapsed_time": "0:10:19", "remaining_time": "2:09:18"}
+{"current_steps": 190, "total_steps": 2436, "loss": 0.7654, "lr": 7.745901639344263e-06, "epoch": 0.2341343191620456, "percentage": 7.8, "elapsed_time": "0:10:53", "remaining_time": "2:08:45"}
+{"current_steps": 200, "total_steps": 2436, "loss": 0.7924, "lr": 8.155737704918034e-06, "epoch": 0.24645717806531114, "percentage": 8.21, "elapsed_time": "0:11:28", "remaining_time": "2:08:18"}
+{"current_steps": 210, "total_steps": 2436, "loss": 0.7754, "lr": 8.565573770491804e-06, "epoch": 0.2587800369685767, "percentage": 8.62, "elapsed_time": "0:12:02", "remaining_time": "2:07:38"}
+{"current_steps": 220, "total_steps": 2436, "loss": 0.7685, "lr": 8.975409836065575e-06, "epoch": 0.2711028958718423, "percentage": 9.03, "elapsed_time": "0:12:34", "remaining_time": "2:06:44"}
+{"current_steps": 230, "total_steps": 2436, "loss": 0.766, "lr": 9.385245901639345e-06, "epoch": 0.28342575477510784, "percentage": 9.44, "elapsed_time": "0:13:09", "remaining_time": "2:06:13"}
+{"current_steps": 240, "total_steps": 2436, "loss": 0.7709, "lr": 9.795081967213116e-06, "epoch": 0.2957486136783734, "percentage": 9.85, "elapsed_time": "0:13:42", "remaining_time": "2:05:24"}
+{"current_steps": 250, "total_steps": 2436, "loss": 0.7434, "lr": 9.999871620167532e-06, "epoch": 0.3080714725816389, "percentage": 10.26, "elapsed_time": "0:14:15", "remaining_time": "2:04:41"}
+{"current_steps": 260, "total_steps": 2436, "loss": 0.7559, "lr": 9.998844621062755e-06, "epoch": 0.3203943314849045, "percentage": 10.67, "elapsed_time": "0:14:50", "remaining_time": "2:04:11"}
+{"current_steps": 270, "total_steps": 2436, "loss": 0.7541, "lr": 9.996790833804053e-06, "epoch": 0.33271719038817005, "percentage": 11.08, "elapsed_time": "0:15:22", "remaining_time": "2:03:24"}
+{"current_steps": 280, "total_steps": 2436, "loss": 0.7565, "lr": 9.993710680249788e-06, "epoch": 0.3450400492914356, "percentage": 11.49, "elapsed_time": "0:15:53", "remaining_time": "2:02:23"}
+{"current_steps": 290, "total_steps": 2436, "loss": 0.7488, "lr": 9.989604793079198e-06, "epoch": 0.3573629081947012, "percentage": 11.9, "elapsed_time": "0:16:26", "remaining_time": "2:01:42"}
+{"current_steps": 300, "total_steps": 2436, "loss": 0.7542, "lr": 9.984474015662421e-06, "epoch": 0.36968576709796674, "percentage": 12.32, "elapsed_time": "0:17:00", "remaining_time": "2:01:04"}
+{"current_steps": 310, "total_steps": 2436, "loss": 0.7626, "lr": 9.978319401887287e-06, "epoch": 0.3820086260012323, "percentage": 12.73, "elapsed_time": "0:17:32", "remaining_time": "2:00:16"}
+{"current_steps": 320, "total_steps": 2436, "loss": 0.7445, "lr": 9.971142215942817e-06, "epoch": 0.3943314849044978, "percentage": 13.14, "elapsed_time": "0:18:06", "remaining_time": "1:59:46"}
+{"current_steps": 330, "total_steps": 2436, "loss": 0.7438, "lr": 9.962943932059573e-06, "epoch": 0.4066543438077634, "percentage": 13.55, "elapsed_time": "0:18:41", "remaining_time": "1:59:17"}
+{"current_steps": 340, "total_steps": 2436, "loss": 0.7403, "lr": 9.953726234206835e-06, "epoch": 0.41897720271102895, "percentage": 13.96, "elapsed_time": "0:19:13", "remaining_time": "1:58:33"}
+{"current_steps": 350, "total_steps": 2436, "loss": 0.7459, "lr": 9.943491015746704e-06, "epoch": 0.4313000616142945, "percentage": 14.37, "elapsed_time": "0:19:47", "remaining_time": "1:57:56"}
+{"current_steps": 360, "total_steps": 2436, "loss": 0.7361, "lr": 9.9322403790452e-06, "epoch": 0.4436229205175601, "percentage": 14.78, "elapsed_time": "0:20:22", "remaining_time": "1:57:32"}
+{"current_steps": 370, "total_steps": 2436, "loss": 0.7373, "lr": 9.919976635040425e-06, "epoch": 0.45594577942082565, "percentage": 15.19, "elapsed_time": "0:20:55", "remaining_time": "1:56:49"}
+{"current_steps": 380, "total_steps": 2436, "loss": 0.7279, "lr": 9.906702302767876e-06, "epoch": 0.4682686383240912, "percentage": 15.6, "elapsed_time": "0:21:29", "remaining_time": "1:56:14"}
+{"current_steps": 390, "total_steps": 2436, "loss": 0.7503, "lr": 9.892420108843038e-06, "epoch": 0.4805914972273567, "percentage": 16.01, "elapsed_time": "0:22:03", "remaining_time": "1:55:43"}
+{"current_steps": 400, "total_steps": 2436, "loss": 0.7528, "lr": 9.877132986901306e-06, "epoch": 0.4929143561306223, "percentage": 16.42, "elapsed_time": "0:22:35", "remaining_time": "1:55:00"}
+{"current_steps": 410, "total_steps": 2436, "loss": 0.7472, "lr": 9.860844076995416e-06, "epoch": 0.5052372150338879, "percentage": 16.83, "elapsed_time": "0:23:10", "remaining_time": "1:54:28"}
+{"current_steps": 420, "total_steps": 2436, "loss": 0.7338, "lr": 9.843556724950454e-06, "epoch": 0.5175600739371534, "percentage": 17.24, "elapsed_time": "0:23:44", "remaining_time": "1:53:57"}
+{"current_steps": 430, "total_steps": 2436, "loss": 0.716, "lr": 9.825274481676605e-06, "epoch": 0.529882932840419, "percentage": 17.65, "elapsed_time": "0:24:18", "remaining_time": "1:53:25"}
+{"current_steps": 440, "total_steps": 2436, "loss": 0.7381, "lr": 9.806001102439789e-06, "epoch": 0.5422057917436846, "percentage": 18.06, "elapsed_time": "0:24:49", "remaining_time": "1:52:37"}
+{"current_steps": 450, "total_steps": 2436, "loss": 0.72, "lr": 9.785740546090293e-06, "epoch": 0.5545286506469501, "percentage": 18.47, "elapsed_time": "0:25:22", "remaining_time": "1:51:58"}
+{"current_steps": 460, "total_steps": 2436, "loss": 0.7302, "lr": 9.76449697424962e-06, "epoch": 0.5668515095502157, "percentage": 18.88, "elapsed_time": "0:25:57", "remaining_time": "1:51:29"}
+{"current_steps": 470, "total_steps": 2436, "loss": 0.7165, "lr": 9.742274750455659e-06, "epoch": 0.5791743684534812, "percentage": 19.29, "elapsed_time": "0:26:30", "remaining_time": "1:50:53"}
+{"current_steps": 480, "total_steps": 2436, "loss": 0.715, "lr": 9.719078439266399e-06, "epoch": 0.5914972273567468, "percentage": 19.7, "elapsed_time": "0:27:01", "remaining_time": "1:50:07"}
+{"current_steps": 490, "total_steps": 2436, "loss": 0.7328, "lr": 9.69491280532234e-06, "epoch": 0.6038200862600123, "percentage": 20.11, "elapsed_time": "0:27:33", "remaining_time": "1:49:25"}
+{"current_steps": 500, "total_steps": 2436, "loss": 0.705, "lr": 9.66978281236782e-06, "epoch": 0.6161429451632778, "percentage": 20.53, "elapsed_time": "0:28:09", "remaining_time": "1:49:01"}
+{"current_steps": 500, "total_steps": 2436, "eval_loss": 0.7097772359848022, "epoch": 0.6161429451632778, "percentage": 20.53, "elapsed_time": "0:32:59", "remaining_time": "2:07:42"}
+{"current_steps": 510, "total_steps": 2436, "loss": 0.7196, "lr": 9.643693622231426e-06, "epoch": 0.6284658040665434, "percentage": 20.94, "elapsed_time": "0:33:31", "remaining_time": "2:06:36"}
+{"current_steps": 520, "total_steps": 2436, "loss": 0.7212, "lr": 9.616650593765733e-06, "epoch": 0.640788662969809, "percentage": 21.35, "elapsed_time": "0:34:05", "remaining_time": "2:05:36"}
+{"current_steps": 530, "total_steps": 2436, "loss": 0.7038, "lr": 9.58865928174657e-06, "epoch": 0.6531115218730745, "percentage": 21.76, "elapsed_time": "0:34:36", "remaining_time": "2:04:28"}
+{"current_steps": 540, "total_steps": 2436, "loss": 0.7108, "lr": 9.559725435732042e-06, "epoch": 0.6654343807763401, "percentage": 22.17, "elapsed_time": "0:35:09", "remaining_time": "2:03:28"}
+{"current_steps": 550, "total_steps": 2436, "loss": 0.7164, "lr": 9.529854998881534e-06, "epoch": 0.6777572396796057, "percentage": 22.58, "elapsed_time": "0:35:43", "remaining_time": "2:02:29"}
+{"current_steps": 560, "total_steps": 2436, "loss": 0.7183, "lr": 9.499054106734963e-06, "epoch": 0.6900800985828712, "percentage": 22.99, "elapsed_time": "0:36:13", "remaining_time": "2:01:20"}
+{"current_steps": 570, "total_steps": 2436, "loss": 0.7082, "lr": 9.467329085952505e-06, "epoch": 0.7024029574861368, "percentage": 23.4, "elapsed_time": "0:36:46", "remaining_time": "2:00:23"}
+{"current_steps": 580, "total_steps": 2436, "loss": 0.7136, "lr": 9.434686453015067e-06, "epoch": 0.7147258163894024, "percentage": 23.81, "elapsed_time": "0:37:15", "remaining_time": "1:59:14"}
+{"current_steps": 590, "total_steps": 2436, "loss": 0.7225, "lr": 9.401132912885764e-06, "epoch": 0.7270486752926679, "percentage": 24.22, "elapsed_time": "0:37:47", "remaining_time": "1:58:13"}
+{"current_steps": 600, "total_steps": 2436, "loss": 0.7133, "lr": 9.36667535763269e-06, "epoch": 0.7393715341959335, "percentage": 24.63, "elapsed_time": "0:38:23", "remaining_time": "1:57:29"}
+{"current_steps": 610, "total_steps": 2436, "loss": 0.7063, "lr": 9.331320865013257e-06, "epoch": 0.751694393099199, "percentage": 25.04, "elapsed_time": "0:38:56", "remaining_time": "1:56:34"}
+{"current_steps": 620, "total_steps": 2436, "loss": 0.6986, "lr": 9.295076697020378e-06, "epoch": 0.7640172520024646, "percentage": 25.45, "elapsed_time": "0:39:29", "remaining_time": "1:55:40"}
+{"current_steps": 630, "total_steps": 2436, "loss": 0.706, "lr": 9.257950298390815e-06, "epoch": 0.7763401109057301, "percentage": 25.86, "elapsed_time": "0:40:04", "remaining_time": "1:54:54"}
+{"current_steps": 640, "total_steps": 2436, "loss": 0.7083, "lr": 9.219949295076006e-06, "epoch": 0.7886629698089956, "percentage": 26.27, "elapsed_time": "0:40:39", "remaining_time": "1:54:05"}
+{"current_steps": 650, "total_steps": 2436, "loss": 0.7066, "lr": 9.181081492675645e-06, "epoch": 0.8009858287122612, "percentage": 26.68, "elapsed_time": "0:41:11", "remaining_time": "1:53:11"}
+{"current_steps": 660, "total_steps": 2436, "loss": 0.6922, "lr": 9.141354874834372e-06, "epoch": 0.8133086876155268, "percentage": 27.09, "elapsed_time": "0:41:43", "remaining_time": "1:52:16"}
|
| 68 |
+
{"current_steps": 670, "total_steps": 2436, "loss": 0.686, "lr": 9.100777601601896e-06, "epoch": 0.8256315465187923, "percentage": 27.5, "elapsed_time": "0:42:18", "remaining_time": "1:51:31"}
|
| 69 |
+
{"current_steps": 680, "total_steps": 2436, "loss": 0.7045, "lr": 9.05935800775688e-06, "epoch": 0.8379544054220579, "percentage": 27.91, "elapsed_time": "0:42:54", "remaining_time": "1:50:47"}
|
| 70 |
+
{"current_steps": 690, "total_steps": 2436, "loss": 0.6952, "lr": 9.017104601094927e-06, "epoch": 0.8502772643253235, "percentage": 28.33, "elapsed_time": "0:43:26", "remaining_time": "1:49:56"}
|
| 71 |
+
{"current_steps": 700, "total_steps": 2436, "loss": 0.6905, "lr": 8.974026060681044e-06, "epoch": 0.862600123228589, "percentage": 28.74, "elapsed_time": "0:44:03", "remaining_time": "1:49:15"}
|
| 72 |
+
{"current_steps": 710, "total_steps": 2436, "loss": 0.7148, "lr": 8.930131235066914e-06, "epoch": 0.8749229821318546, "percentage": 29.15, "elapsed_time": "0:44:37", "remaining_time": "1:48:28"}
|
| 73 |
+
{"current_steps": 720, "total_steps": 2436, "loss": 0.693, "lr": 8.885429140473361e-06, "epoch": 0.8872458410351202, "percentage": 29.56, "elapsed_time": "0:45:10", "remaining_time": "1:47:40"}
|
| 74 |
+
{"current_steps": 730, "total_steps": 2436, "loss": 0.7023, "lr": 8.839928958938364e-06, "epoch": 0.8995686999383857, "percentage": 29.97, "elapsed_time": "0:45:43", "remaining_time": "1:46:52"}
|
| 75 |
+
{"current_steps": 740, "total_steps": 2436, "loss": 0.7014, "lr": 8.793640036431036e-06, "epoch": 0.9118915588416513, "percentage": 30.38, "elapsed_time": "0:46:15", "remaining_time": "1:46:02"}
|
| 76 |
+
{"current_steps": 750, "total_steps": 2436, "loss": 0.7153, "lr": 8.746571880931896e-06, "epoch": 0.9242144177449169, "percentage": 30.79, "elapsed_time": "0:46:50", "remaining_time": "1:45:17"}
|
| 77 |
+
{"current_steps": 760, "total_steps": 2436, "loss": 0.7031, "lr": 8.698734160479892e-06, "epoch": 0.9365372766481824, "percentage": 31.2, "elapsed_time": "0:47:23", "remaining_time": "1:44:31"}
|
| 78 |
+
{"current_steps": 770, "total_steps": 2436, "loss": 0.6744, "lr": 8.650136701186537e-06, "epoch": 0.9488601355514479, "percentage": 31.61, "elapsed_time": "0:47:56", "remaining_time": "1:43:43"}
|
| 79 |
+
{"current_steps": 780, "total_steps": 2436, "loss": 0.6921, "lr": 8.60078948521757e-06, "epoch": 0.9611829944547134, "percentage": 32.02, "elapsed_time": "0:48:30", "remaining_time": "1:42:59"}
|
| 80 |
+
{"current_steps": 790, "total_steps": 2436, "loss": 0.6956, "lr": 8.550702648742566e-06, "epoch": 0.973505853357979, "percentage": 32.43, "elapsed_time": "0:49:06", "remaining_time": "1:42:19"}
|
| 81 |
+
{"current_steps": 800, "total_steps": 2436, "loss": 0.7018, "lr": 8.499886479852935e-06, "epoch": 0.9858287122612446, "percentage": 32.84, "elapsed_time": "0:49:39", "remaining_time": "1:41:33"}
|
| 82 |
+
{"current_steps": 810, "total_steps": 2436, "loss": 0.7032, "lr": 8.448351416448664e-06, "epoch": 0.9981515711645101, "percentage": 33.25, "elapsed_time": "0:50:15", "remaining_time": "1:40:52"}
|
| 83 |
+
{"current_steps": 820, "total_steps": 2436, "loss": 0.595, "lr": 8.39610804409435e-06, "epoch": 1.0098582871226125, "percentage": 33.66, "elapsed_time": "0:50:47", "remaining_time": "1:40:06"}
|
| 84 |
+
{"current_steps": 830, "total_steps": 2436, "loss": 0.5946, "lr": 8.343167093844847e-06, "epoch": 1.022181146025878, "percentage": 34.07, "elapsed_time": "0:51:21", "remaining_time": "1:39:23"}
|
| 85 |
+
{"current_steps": 840, "total_steps": 2436, "loss": 0.5845, "lr": 8.289539440041066e-06, "epoch": 1.0345040049291436, "percentage": 34.48, "elapsed_time": "0:51:54", "remaining_time": "1:38:38"}
|
| 86 |
+
{"current_steps": 850, "total_steps": 2436, "loss": 0.584, "lr": 8.23523609807633e-06, "epoch": 1.0468268638324092, "percentage": 34.89, "elapsed_time": "0:52:26", "remaining_time": "1:37:51"}
|
| 87 |
+
{"current_steps": 860, "total_steps": 2436, "loss": 0.5847, "lr": 8.180268222133748e-06, "epoch": 1.0591497227356748, "percentage": 35.3, "elapsed_time": "0:53:01", "remaining_time": "1:37:09"}
|
| 88 |
+
{"current_steps": 870, "total_steps": 2436, "loss": 0.5925, "lr": 8.124647102895098e-06, "epoch": 1.0714725816389403, "percentage": 35.71, "elapsed_time": "0:53:35", "remaining_time": "1:36:27"}
|
| 89 |
+
{"current_steps": 880, "total_steps": 2436, "loss": 0.584, "lr": 8.068384165221657e-06, "epoch": 1.083795440542206, "percentage": 36.12, "elapsed_time": "0:54:09", "remaining_time": "1:35:45"}
|
| 90 |
+
{"current_steps": 890, "total_steps": 2436, "loss": 0.5932, "lr": 8.011490965807479e-06, "epoch": 1.0961182994454712, "percentage": 36.54, "elapsed_time": "0:54:42", "remaining_time": "1:35:01"}
|
| 91 |
+
{"current_steps": 900, "total_steps": 2436, "loss": 0.5642, "lr": 7.953979190805587e-06, "epoch": 1.1084411583487368, "percentage": 36.95, "elapsed_time": "0:55:17", "remaining_time": "1:34:21"}
|
| 92 |
+
{"current_steps": 910, "total_steps": 2436, "loss": 0.6045, "lr": 7.89586065342759e-06, "epoch": 1.1207640172520024, "percentage": 37.36, "elapsed_time": "0:55:49", "remaining_time": "1:33:37"}
|
| 93 |
+
{"current_steps": 920, "total_steps": 2436, "loss": 0.5985, "lr": 7.837147291517172e-06, "epoch": 1.133086876155268, "percentage": 37.77, "elapsed_time": "0:56:23", "remaining_time": "1:32:55"}
|
| 94 |
+
{"current_steps": 930, "total_steps": 2436, "loss": 0.5686, "lr": 7.777851165098012e-06, "epoch": 1.1454097350585335, "percentage": 38.18, "elapsed_time": "0:56:54", "remaining_time": "1:32:08"}
|
| 95 |
+
{"current_steps": 940, "total_steps": 2436, "loss": 0.5888, "lr": 7.717984453896585e-06, "epoch": 1.157732593961799, "percentage": 38.59, "elapsed_time": "0:57:30", "remaining_time": "1:31:30"}
|
| 96 |
+
{"current_steps": 950, "total_steps": 2436, "loss": 0.5813, "lr": 7.657559454840386e-06, "epoch": 1.1700554528650646, "percentage": 39.0, "elapsed_time": "0:58:03", "remaining_time": "1:30:49"}
|
| 97 |
+
{"current_steps": 960, "total_steps": 2436, "loss": 0.582, "lr": 7.596588579532087e-06, "epoch": 1.1823783117683302, "percentage": 39.41, "elapsed_time": "0:58:37", "remaining_time": "1:30:08"}
|
| 98 |
+
{"current_steps": 970, "total_steps": 2436, "loss": 0.5855, "lr": 7.535084351700117e-06, "epoch": 1.1947011706715958, "percentage": 39.82, "elapsed_time": "0:59:10", "remaining_time": "1:29:25"}
|
| 99 |
+
{"current_steps": 980, "total_steps": 2436, "loss": 0.5842, "lr": 7.473059404626229e-06, "epoch": 1.2070240295748613, "percentage": 40.23, "elapsed_time": "0:59:43", "remaining_time": "1:28:44"}
|
| 100 |
+
{"current_steps": 990, "total_steps": 2436, "loss": 0.5927, "lr": 7.410526478550568e-06, "epoch": 1.219346888478127, "percentage": 40.64, "elapsed_time": "1:00:18", "remaining_time": "1:28:04"}
|
| 101 |
+
{"current_steps": 1000, "total_steps": 2436, "loss": 0.5925, "lr": 7.34749841805475e-06, "epoch": 1.2316697473813925, "percentage": 41.05, "elapsed_time": "1:00:50", "remaining_time": "1:27:22"}
|
| 102 |
+
{"current_steps": 1000, "total_steps": 2436, "eval_loss": 0.6842279434204102, "epoch": 1.2316697473813925, "percentage": 41.05, "elapsed_time": "1:05:43", "remaining_time": "1:34:22"}
|
| 103 |
+
{"current_steps": 1010, "total_steps": 2436, "loss": 0.6095, "lr": 7.283988169423526e-06, "epoch": 1.243992606284658, "percentage": 41.46, "elapsed_time": "1:06:35", "remaining_time": "1:34:01"}
|
| 104 |
+
{"current_steps": 1020, "total_steps": 2436, "loss": 0.5953, "lr": 7.2200087779855435e-06, "epoch": 1.2563154651879236, "percentage": 41.87, "elapsed_time": "1:07:07", "remaining_time": "1:33:11"}
|
| 105 |
+
{"current_steps": 1030, "total_steps": 2436, "loss": 0.5913, "lr": 7.155573385433772e-06, "epoch": 1.2686383240911892, "percentage": 42.28, "elapsed_time": "1:07:41", "remaining_time": "1:32:24"}
|
| 106 |
+
{"current_steps": 1040, "total_steps": 2436, "loss": 0.6022, "lr": 7.090695227126141e-06, "epoch": 1.2809611829944547, "percentage": 42.69, "elapsed_time": "1:08:15", "remaining_time": "1:31:37"}
|
| 107 |
+
{"current_steps": 1050, "total_steps": 2436, "loss": 0.5891, "lr": 7.025387629366912e-06, "epoch": 1.2932840418977203, "percentage": 43.1, "elapsed_time": "1:08:51", "remaining_time": "1:30:53"}
|
| 108 |
+
{"current_steps": 1060, "total_steps": 2436, "loss": 0.578, "lr": 6.959664006669404e-06, "epoch": 1.3056069008009858, "percentage": 43.51, "elapsed_time": "1:09:22", "remaining_time": "1:30:03"}
|
| 109 |
+
{"current_steps": 1070, "total_steps": 2436, "loss": 0.5837, "lr": 6.893537859000576e-06, "epoch": 1.3179297597042514, "percentage": 43.92, "elapsed_time": "1:09:55", "remaining_time": "1:29:16"}
|
| 110 |
+
{"current_steps": 1080, "total_steps": 2436, "loss": 0.5783, "lr": 6.827022769008068e-06, "epoch": 1.330252618607517, "percentage": 44.33, "elapsed_time": "1:10:30", "remaining_time": "1:28:31"}
|
| 111 |
+
{"current_steps": 1090, "total_steps": 2436, "loss": 0.5882, "lr": 6.7601323992302525e-06, "epoch": 1.3425754775107825, "percentage": 44.75, "elapsed_time": "1:11:02", "remaining_time": "1:27:43"}
|
| 112 |
+
{"current_steps": 1100, "total_steps": 2436, "loss": 0.5889, "lr": 6.692880489289885e-06, "epoch": 1.354898336414048, "percentage": 45.16, "elapsed_time": "1:11:38", "remaining_time": "1:27:00"}
|
| 113 |
+
{"current_steps": 1110, "total_steps": 2436, "loss": 0.5886, "lr": 6.6252808530719095e-06, "epoch": 1.3672211953173137, "percentage": 45.57, "elapsed_time": "1:12:12", "remaining_time": "1:26:15"}
|
| 114 |
+
{"current_steps": 1120, "total_steps": 2436, "loss": 0.5737, "lr": 6.557347375886022e-06, "epoch": 1.3795440542205792, "percentage": 45.98, "elapsed_time": "1:12:45", "remaining_time": "1:25:29"}
|
| 115 |
+
{"current_steps": 1130, "total_steps": 2436, "loss": 0.5876, "lr": 6.489094011614553e-06, "epoch": 1.3918669131238448, "percentage": 46.39, "elapsed_time": "1:13:18", "remaining_time": "1:24:43"}
|
| 116 |
+
{"current_steps": 1140, "total_steps": 2436, "loss": 0.5917, "lr": 6.4205347798462704e-06, "epoch": 1.4041897720271104, "percentage": 46.8, "elapsed_time": "1:13:53", "remaining_time": "1:24:00"}
|
| 117 |
+
{"current_steps": 1150, "total_steps": 2436, "loss": 0.5836, "lr": 6.351683762996681e-06, "epoch": 1.416512630930376, "percentage": 47.21, "elapsed_time": "1:14:27", "remaining_time": "1:23:15"}
|
| 118 |
+
{"current_steps": 1160, "total_steps": 2436, "loss": 0.5783, "lr": 6.282555103415438e-06, "epoch": 1.4288354898336415, "percentage": 47.62, "elapsed_time": "1:15:03", "remaining_time": "1:22:34"}
|
| 119 |
+
{"current_steps": 1170, "total_steps": 2436, "loss": 0.5723, "lr": 6.213163000481428e-06, "epoch": 1.441158348736907, "percentage": 48.03, "elapsed_time": "1:15:37", "remaining_time": "1:21:49"}
|
| 120 |
+
{"current_steps": 1180, "total_steps": 2436, "loss": 0.6022, "lr": 6.143521707686137e-06, "epoch": 1.4534812076401726, "percentage": 48.44, "elapsed_time": "1:16:10", "remaining_time": "1:21:05"}
|
| 121 |
+
{"current_steps": 1190, "total_steps": 2436, "loss": 0.5917, "lr": 6.073645529705926e-06, "epoch": 1.4658040665434382, "percentage": 48.85, "elapsed_time": "1:16:45", "remaining_time": "1:20:22"}
|
| 122 |
+
{"current_steps": 1200, "total_steps": 2436, "loss": 0.5833, "lr": 6.0035488194637645e-06, "epoch": 1.4781269254467038, "percentage": 49.26, "elapsed_time": "1:17:16", "remaining_time": "1:19:35"}
|
| 123 |
+
{"current_steps": 1210, "total_steps": 2436, "loss": 0.5898, "lr": 5.933245975181074e-06, "epoch": 1.4904497843499693, "percentage": 49.67, "elapsed_time": "1:17:50", "remaining_time": "1:18:51"}
|
| 124 |
+
{"current_steps": 1220, "total_steps": 2436, "loss": 0.5842, "lr": 5.8627514374202596e-06, "epoch": 1.502772643253235, "percentage": 50.08, "elapsed_time": "1:18:23", "remaining_time": "1:18:08"}
|
| 125 |
+
{"current_steps": 1230, "total_steps": 2436, "loss": 0.5839, "lr": 5.79207968611854e-06, "epoch": 1.5150955021565005, "percentage": 50.49, "elapsed_time": "1:18:55", "remaining_time": "1:17:23"}
|
| 126 |
+
{"current_steps": 1240, "total_steps": 2436, "loss": 0.5764, "lr": 5.721245237613704e-06, "epoch": 1.527418361059766, "percentage": 50.9, "elapsed_time": "1:19:27", "remaining_time": "1:16:38"}
|
| 127 |
+
{"current_steps": 1250, "total_steps": 2436, "loss": 0.586, "lr": 5.650262641662367e-06, "epoch": 1.5397412199630314, "percentage": 51.31, "elapsed_time": "1:20:00", "remaining_time": "1:15:54"}
|
| 128 |
+
{"current_steps": 1260, "total_steps": 2436, "loss": 0.5702, "lr": 5.5791464784513905e-06, "epoch": 1.552064078866297, "percentage": 51.72, "elapsed_time": "1:20:34", "remaining_time": "1:15:11"}
|
| 129 |
+
{"current_steps": 1270, "total_steps": 2436, "loss": 0.5833, "lr": 5.50791135560303e-06, "epoch": 1.5643869377695625, "percentage": 52.13, "elapsed_time": "1:21:08", "remaining_time": "1:14:29"}
|
| 130 |
+
{"current_steps": 1280, "total_steps": 2436, "loss": 0.5738, "lr": 5.4365719051744556e-06, "epoch": 1.576709796672828, "percentage": 52.55, "elapsed_time": "1:21:40", "remaining_time": "1:13:45"}
|
| 131 |
+
{"current_steps": 1290, "total_steps": 2436, "loss": 0.5811, "lr": 5.365142780652255e-06, "epoch": 1.5890326555760936, "percentage": 52.96, "elapsed_time": "1:22:13", "remaining_time": "1:13:02"}
|
| 132 |
+
{"current_steps": 1300, "total_steps": 2436, "loss": 0.6035, "lr": 5.2936386539425325e-06, "epoch": 1.6013555144793592, "percentage": 53.37, "elapsed_time": "1:22:49", "remaining_time": "1:12:22"}
|
| 133 |
+
{"current_steps": 1310, "total_steps": 2436, "loss": 0.5767, "lr": 5.222074212357221e-06, "epoch": 1.6136783733826248, "percentage": 53.78, "elapsed_time": "1:23:23", "remaining_time": "1:11:40"}
|
| 134 |
+
{"current_steps": 1320, "total_steps": 2436, "loss": 0.5798, "lr": 5.150464155597239e-06, "epoch": 1.6260012322858903, "percentage": 54.19, "elapsed_time": "1:23:55", "remaining_time": "1:10:57"}
|
| 135 |
+
{"current_steps": 1330, "total_steps": 2436, "loss": 0.5757, "lr": 5.0788231927330924e-06, "epoch": 1.638324091189156, "percentage": 54.6, "elapsed_time": "1:24:28", "remaining_time": "1:10:15"}
|
| 136 |
+
{"current_steps": 1340, "total_steps": 2436, "loss": 0.5762, "lr": 5.007166039183561e-06, "epoch": 1.6506469500924215, "percentage": 55.01, "elapsed_time": "1:25:01", "remaining_time": "1:09:32"}
|
| 137 |
+
{"current_steps": 1350, "total_steps": 2436, "loss": 0.5758, "lr": 4.935507413693071e-06, "epoch": 1.662969808995687, "percentage": 55.42, "elapsed_time": "1:25:34", "remaining_time": "1:08:50"}
|
| 138 |
+
{"current_steps": 1360, "total_steps": 2436, "loss": 0.5727, "lr": 4.863862035308392e-06, "epoch": 1.6752926678989526, "percentage": 55.83, "elapsed_time": "1:26:09", "remaining_time": "1:08:10"}
|
| 139 |
+
{"current_steps": 1370, "total_steps": 2436, "loss": 0.5723, "lr": 4.792244620355275e-06, "epoch": 1.6876155268022182, "percentage": 56.24, "elapsed_time": "1:26:43", "remaining_time": "1:07:29"}
|
| 140 |
+
{"current_steps": 1380, "total_steps": 2436, "loss": 0.58, "lr": 4.720669879415637e-06, "epoch": 1.6999383857054837, "percentage": 56.65, "elapsed_time": "1:27:16", "remaining_time": "1:06:47"}
|
| 141 |
+
{"current_steps": 1390, "total_steps": 2436, "loss": 0.5831, "lr": 4.649152514305934e-06, "epoch": 1.712261244608749, "percentage": 57.06, "elapsed_time": "1:27:50", "remaining_time": "1:06:05"}
|
| 142 |
+
{"current_steps": 1400, "total_steps": 2436, "loss": 0.5647, "lr": 4.5777072150573355e-06, "epoch": 1.7245841035120146, "percentage": 57.47, "elapsed_time": "1:28:21", "remaining_time": "1:05:22"}
|
| 143 |
+
{"current_steps": 1410, "total_steps": 2436, "loss": 0.5631, "lr": 4.506348656898316e-06, "epoch": 1.7369069624152802, "percentage": 57.88, "elapsed_time": "1:28:57", "remaining_time": "1:04:43"}
|
| 144 |
+
{"current_steps": 1420, "total_steps": 2436, "loss": 0.5855, "lr": 4.435091497240287e-06, "epoch": 1.7492298213185458, "percentage": 58.29, "elapsed_time": "1:29:29", "remaining_time": "1:04:01"}
|
| 145 |
+
{"current_steps": 1430, "total_steps": 2436, "loss": 0.5714, "lr": 4.363950372666896e-06, "epoch": 1.7615526802218113, "percentage": 58.7, "elapsed_time": "1:30:02", "remaining_time": "1:03:20"}
|
| 146 |
+
{"current_steps": 1440, "total_steps": 2436, "loss": 0.5754, "lr": 4.292939895927587e-06, "epoch": 1.773875539125077, "percentage": 59.11, "elapsed_time": "1:30:33", "remaining_time": "1:02:38"}
|
| 147 |
+
{"current_steps": 1450, "total_steps": 2436, "loss": 0.5876, "lr": 4.2220746529360745e-06, "epoch": 1.7861983980283425, "percentage": 59.52, "elapsed_time": "1:31:06", "remaining_time": "1:01:56"}
|
| 148 |
+
{"current_steps": 1460, "total_steps": 2436, "loss": 0.5787, "lr": 4.151369199774325e-06, "epoch": 1.798521256931608, "percentage": 59.93, "elapsed_time": "1:31:39", "remaining_time": "1:01:16"}
|
| 149 |
+
{"current_steps": 1470, "total_steps": 2436, "loss": 0.5798, "lr": 4.080838059702656e-06, "epoch": 1.8108441158348736, "percentage": 60.34, "elapsed_time": "1:32:10", "remaining_time": "1:00:34"}
|
| 150 |
+
{"current_steps": 1480, "total_steps": 2436, "loss": 0.5813, "lr": 4.0104957201765874e-06, "epoch": 1.8231669747381392, "percentage": 60.76, "elapsed_time": "1:32:44", "remaining_time": "0:59:54"}
|
| 151 |
+
{"current_steps": 1490, "total_steps": 2436, "loss": 0.5737, "lr": 3.940356629871051e-06, "epoch": 1.8354898336414047, "percentage": 61.17, "elapsed_time": "1:33:17", "remaining_time": "0:59:13"}
|
| 152 |
+
{"current_steps": 1500, "total_steps": 2436, "loss": 0.5739, "lr": 3.870435195712547e-06, "epoch": 1.8478126925446703, "percentage": 61.58, "elapsed_time": "1:33:50", "remaining_time": "0:58:33"}
|
| 153 |
+
{"current_steps": 1500, "total_steps": 2436, "eval_loss": 0.6650247573852539, "epoch": 1.8478126925446703, "percentage": 61.58, "elapsed_time": "1:38:44", "remaining_time": "1:01:36"}
|
| 154 |
+
{"current_steps": 1510, "total_steps": 2436, "loss": 0.5996, "lr": 3.8007457799198977e-06, "epoch": 1.8601355514479359, "percentage": 61.99, "elapsed_time": "1:39:19", "remaining_time": "1:00:54"}
|
| 155 |
+
{"current_steps": 1520, "total_steps": 2436, "loss": 0.5686, "lr": 3.7313026970541687e-06, "epoch": 1.8724584103512014, "percentage": 62.4, "elapsed_time": "1:39:52", "remaining_time": "1:00:11"}
|
| 156 |
+
{"current_steps": 1530, "total_steps": 2436, "loss": 0.5814, "lr": 3.662120211078385e-06, "epoch": 1.884781269254467, "percentage": 62.81, "elapsed_time": "1:40:25", "remaining_time": "0:59:28"}
|
| 157 |
+
{"current_steps": 1540, "total_steps": 2436, "loss": 0.5821, "lr": 3.5932125324276524e-06, "epoch": 1.8971041281577325, "percentage": 63.22, "elapsed_time": "1:40:59", "remaining_time": "0:58:45"}
|
| 158 |
+
{"current_steps": 1550, "total_steps": 2436, "loss": 0.5753, "lr": 3.524593815090241e-06, "epoch": 1.9094269870609981, "percentage": 63.63, "elapsed_time": "1:41:34", "remaining_time": "0:58:03"}
|
| 159 |
+
{"current_steps": 1560, "total_steps": 2436, "loss": 0.5659, "lr": 3.4562781537003e-06, "epoch": 1.9217498459642637, "percentage": 64.04, "elapsed_time": "1:42:05", "remaining_time": "0:57:19"}
|
| 160 |
+
{"current_steps": 1570, "total_steps": 2436, "loss": 0.5705, "lr": 3.3882795806427437e-06, "epoch": 1.9340727048675292, "percentage": 64.45, "elapsed_time": "1:42:39", "remaining_time": "0:56:37"}
|
| 161 |
+
{"current_steps": 1580, "total_steps": 2436, "loss": 0.573, "lr": 3.320612063170926e-06, "epoch": 1.9463955637707948, "percentage": 64.86, "elapsed_time": "1:43:15", "remaining_time": "0:55:56"}
|
| 162 |
+
{"current_steps": 1590, "total_steps": 2436, "loss": 0.5594, "lr": 3.2532895005376943e-06, "epoch": 1.9587184226740604, "percentage": 65.27, "elapsed_time": "1:43:48", "remaining_time": "0:55:14"}
|
| 163 |
+
{"current_steps": 1600, "total_steps": 2436, "loss": 0.5682, "lr": 3.18632572114042e-06, "epoch": 1.971041281577326, "percentage": 65.68, "elapsed_time": "1:44:24", "remaining_time": "0:54:32"}
|
| 164 |
+
{"current_steps": 1610, "total_steps": 2436, "loss": 0.5642, "lr": 3.1197344796805675e-06, "epoch": 1.9833641404805915, "percentage": 66.09, "elapsed_time": "1:44:56", "remaining_time": "0:53:50"}
|
| 165 |
+
{"current_steps": 1620, "total_steps": 2436, "loss": 0.5691, "lr": 3.0535294543384074e-06, "epoch": 1.995686999383857, "percentage": 66.5, "elapsed_time": "1:45:31", "remaining_time": "0:53:08"}
|
| 166 |
+
{"current_steps": 1630, "total_steps": 2436, "loss": 0.4903, "lr": 2.987724243963458e-06, "epoch": 2.0073937153419594, "percentage": 66.91, "elapsed_time": "1:46:04", "remaining_time": "0:52:27"}
|
| 167 |
+
{"current_steps": 1640, "total_steps": 2436, "loss": 0.4882, "lr": 2.922332365281201e-06, "epoch": 2.019716574245225, "percentage": 67.32, "elapsed_time": "1:46:40", "remaining_time": "0:51:46"}
|
| 168 |
+
{"current_steps": 1650, "total_steps": 2436, "loss": 0.4714, "lr": 2.857367250116682e-06, "epoch": 2.0320394331484906, "percentage": 67.73, "elapsed_time": "1:47:13", "remaining_time": "0:51:04"}
|
| 169 |
+
{"current_steps": 1660, "total_steps": 2436, "loss": 0.4748, "lr": 2.7928422426355554e-06, "epoch": 2.044362292051756, "percentage": 68.14, "elapsed_time": "1:47:45", "remaining_time": "0:50:22"}
|
| 170 |
+
{"current_steps": 1670, "total_steps": 2436, "loss": 0.4764, "lr": 2.728770596603105e-06, "epoch": 2.0566851509550217, "percentage": 68.56, "elapsed_time": "1:48:20", "remaining_time": "0:49:41"}
|
| 171 |
+
{"current_steps": 1680, "total_steps": 2436, "loss": 0.4735, "lr": 2.665165472661866e-06, "epoch": 2.0690080098582873, "percentage": 68.97, "elapsed_time": "1:48:55", "remaining_time": "0:49:00"}
|
| 172 |
+
{"current_steps": 1690, "total_steps": 2436, "loss": 0.4578, "lr": 2.6020399356283586e-06, "epoch": 2.081330868761553, "percentage": 69.38, "elapsed_time": "1:49:30", "remaining_time": "0:48:20"}
|
| 173 |
+
{"current_steps": 1700, "total_steps": 2436, "loss": 0.4737, "lr": 2.539406951809512e-06, "epoch": 2.0936537276648184, "percentage": 69.79, "elapsed_time": "1:50:01", "remaining_time": "0:47:38"}
|
| 174 |
+
{"current_steps": 1710, "total_steps": 2436, "loss": 0.4695, "lr": 2.477279386339309e-06, "epoch": 2.105976586568084, "percentage": 70.2, "elapsed_time": "1:50:36", "remaining_time": "0:46:57"}
|
| 175 |
+
{"current_steps": 1720, "total_steps": 2436, "loss": 0.4785, "lr": 2.4156700005362384e-06, "epoch": 2.1182994454713495, "percentage": 70.61, "elapsed_time": "1:51:09", "remaining_time": "0:46:16"}
|
| 176 |
+
{"current_steps": 1730, "total_steps": 2436, "loss": 0.4723, "lr": 2.3545914492820366e-06, "epoch": 2.130622304374615, "percentage": 71.02, "elapsed_time": "1:51:44", "remaining_time": "0:45:35"}
|
| 177 |
+
{"current_steps": 1740, "total_steps": 2436, "loss": 0.4672, "lr": 2.2940562784223224e-06, "epoch": 2.1429451632778806, "percentage": 71.43, "elapsed_time": "1:52:15", "remaining_time": "0:44:54"}
|
| 178 |
+
{"current_steps": 1750, "total_steps": 2436, "loss": 0.4734, "lr": 2.234076922189613e-06, "epoch": 2.155268022181146, "percentage": 71.84, "elapsed_time": "1:52:49", "remaining_time": "0:44:13"}
|
| 179 |
+
{"current_steps": 1760, "total_steps": 2436, "loss": 0.4512, "lr": 2.174665700649267e-06, "epoch": 2.167590881084412, "percentage": 72.25, "elapsed_time": "1:53:23", "remaining_time": "0:43:33"}
|
| 180 |
+
{"current_steps": 1770, "total_steps": 2436, "loss": 0.4678, "lr": 2.1158348171688888e-06, "epoch": 2.1799137399876773, "percentage": 72.66, "elapsed_time": "1:53:56", "remaining_time": "0:42:52"}
|
| 181 |
+
{"current_steps": 1780, "total_steps": 2436, "loss": 0.4719, "lr": 2.0575963559116823e-06, "epoch": 2.1922365988909425, "percentage": 73.07, "elapsed_time": "1:54:29", "remaining_time": "0:42:11"}
|
| 182 |
+
{"current_steps": 1790, "total_steps": 2436, "loss": 0.4539, "lr": 1.999962279354311e-06, "epoch": 2.2045594577942085, "percentage": 73.48, "elapsed_time": "1:55:03", "remaining_time": "0:41:31"}
|
| 183 |
+
{"current_steps": 1800, "total_steps": 2436, "loss": 0.4759, "lr": 1.942944425829741e-06, "epoch": 2.2168823166974736, "percentage": 73.89, "elapsed_time": "1:55:35", "remaining_time": "0:40:50"}
|
| 184 |
+
{"current_steps": 1810, "total_steps": 2436, "loss": 0.4633, "lr": 1.8865545070955882e-06, "epoch": 2.229205175600739, "percentage": 74.3, "elapsed_time": "1:56:07", "remaining_time": "0:40:09"}
|
| 185 |
+
{"current_steps": 1820, "total_steps": 2436, "loss": 0.4683, "lr": 1.8308041059284621e-06, "epoch": 2.2415280345040047, "percentage": 74.71, "elapsed_time": "1:56:41", "remaining_time": "0:39:29"}
|
| 186 |
+
{"current_steps": 1830, "total_steps": 2436, "loss": 0.477, "lr": 1.775704673744809e-06, "epoch": 2.2538508934072703, "percentage": 75.12, "elapsed_time": "1:57:14", "remaining_time": "0:38:49"}
|
| 187 |
+
{"current_steps": 1840, "total_steps": 2436, "loss": 0.4792, "lr": 1.7212675282487269e-06, "epoch": 2.266173752310536, "percentage": 75.53, "elapsed_time": "1:57:50", "remaining_time": "0:38:10"}
|
| 188 |
+
{"current_steps": 1850, "total_steps": 2436, "loss": 0.4704, "lr": 1.6675038511072518e-06, "epoch": 2.2784966112138014, "percentage": 75.94, "elapsed_time": "1:58:21", "remaining_time": "0:37:29"}
|
| 189 |
+
{"current_steps": 1860, "total_steps": 2436, "loss": 0.4685, "lr": 1.6144246856535933e-06, "epoch": 2.290819470117067, "percentage": 76.35, "elapsed_time": "1:58:55", "remaining_time": "0:36:49"}
|
| 190 |
+
{"current_steps": 1870, "total_steps": 2436, "loss": 0.4786, "lr": 1.5620409346187697e-06, "epoch": 2.3031423290203326, "percentage": 76.77, "elapsed_time": "1:59:30", "remaining_time": "0:36:10"}
|
| 191 |
+
{"current_steps": 1880, "total_steps": 2436, "loss": 0.469, "lr": 1.510363357892133e-06, "epoch": 2.315465187923598, "percentage": 77.18, "elapsed_time": "2:00:03", "remaining_time": "0:35:30"}
|
| 192 |
+
{"current_steps": 1890, "total_steps": 2436, "loss": 0.4706, "lr": 1.4594025703112397e-06, "epoch": 2.3277880468268637, "percentage": 77.59, "elapsed_time": "2:00:34", "remaining_time": "0:34:50"}
|
| 193 |
+
{"current_steps": 1900, "total_steps": 2436, "loss": 0.4586, "lr": 1.4091690394814989e-06, "epoch": 2.3401109057301293, "percentage": 78.0, "elapsed_time": "2:01:07", "remaining_time": "0:34:10"}
|
| 194 |
+
{"current_steps": 1910, "total_steps": 2436, "loss": 0.4679, "lr": 1.359673083626079e-06, "epoch": 2.352433764633395, "percentage": 78.41, "elapsed_time": "2:01:41", "remaining_time": "0:33:30"}
|
| 195 |
+
{"current_steps": 1920, "total_steps": 2436, "loss": 0.4688, "lr": 1.3109248694664917e-06, "epoch": 2.3647566235366604, "percentage": 78.82, "elapsed_time": "2:02:17", "remaining_time": "0:32:52"}
|
| 196 |
+
{"current_steps": 1930, "total_steps": 2436, "loss": 0.4692, "lr": 1.262934410134292e-06, "epoch": 2.377079482439926, "percentage": 79.23, "elapsed_time": "2:02:52", "remaining_time": "0:32:12"}
|
| 197 |
+
{"current_steps": 1940, "total_steps": 2436, "loss": 0.4734, "lr": 1.2157115631143384e-06, "epoch": 2.3894023413431915, "percentage": 79.64, "elapsed_time": "2:03:27", "remaining_time": "0:31:33"}
|
| 198 |
+
{"current_steps": 1950, "total_steps": 2436, "loss": 0.4789, "lr": 1.169266028220004e-06, "epoch": 2.401725200246457, "percentage": 80.05, "elapsed_time": "2:03:58", "remaining_time": "0:30:53"}
|
| 199 |
+
{"current_steps": 1960, "total_steps": 2436, "loss": 0.4761, "lr": 1.1236073456007928e-06, "epoch": 2.4140480591497226, "percentage": 80.46, "elapsed_time": "2:04:32", "remaining_time": "0:30:14"}
|
| 200 |
+
{"current_steps": 1970, "total_steps": 2436, "loss": 0.4506, "lr": 1.0787448937827428e-06, "epoch": 2.426370918052988, "percentage": 80.87, "elapsed_time": "2:05:07", "remaining_time": "0:29:35"}
|
| 201 |
+
{"current_steps": 1980, "total_steps": 2436, "loss": 0.4661, "lr": 1.034687887742028e-06, "epoch": 2.438693776956254, "percentage": 81.28, "elapsed_time": "2:05:41", "remaining_time": "0:28:56"}
|
| 202 |
+
{"current_steps": 1990, "total_steps": 2436, "loss": 0.4535, "lr": 9.914453770121557e-07, "epoch": 2.4510166358595193, "percentage": 81.69, "elapsed_time": "2:06:14", "remaining_time": "0:28:17"}
|
| 203 |
+
{"current_steps": 2000, "total_steps": 2436, "loss": 0.4637, "lr": 9.490262438251496e-07, "epoch": 2.463339494762785, "percentage": 82.1, "elapsed_time": "2:06:45", "remaining_time": "0:27:38"}
|
| 204 |
+
{"current_steps": 2000, "total_steps": 2436, "eval_loss": 0.6939365267753601, "epoch": 2.463339494762785, "percentage": 82.1, "elapsed_time": "2:11:36", "remaining_time": "0:28:41"}
|
| 205 |
+
{"current_steps": 2010, "total_steps": 2436, "loss": 0.4673, "lr": 9.07439201287088e-07, "epoch": 2.4756623536660505, "percentage": 82.51, "elapsed_time": "2:12:35", "remaining_time": "0:28:06"}
|
| 206 |
+
{"current_steps": 2020, "total_steps": 2436, "loss": 0.4594, "lr": 8.666927915883905e-07, "epoch": 2.487985212569316, "percentage": 82.92, "elapsed_time": "2:13:08", "remaining_time": "0:27:25"}
|
| 207 |
+
{"current_steps": 2030, "total_steps": 2436, "loss": 0.4633, "lr": 8.2679538424921e-07, "epoch": 2.5003080714725816, "percentage": 83.33, "elapsed_time": "2:13:39", "remaining_time": "0:26:43"}
|
| 208 |
+
{"current_steps": 2040, "total_steps": 2436, "loss": 0.4716, "lr": 7.877551744002881e-07, "epoch": 2.512630930375847, "percentage": 83.74, "elapsed_time": "2:14:11", "remaining_time": "0:26:03"}
|
| 209 |
+
{"current_steps": 2050, "total_steps": 2436, "loss": 0.4748, "lr": 7.495801810996334e-07, "epoch": 2.5249537892791127, "percentage": 84.15, "elapsed_time": "2:14:45", "remaining_time": "0:25:22"}
|
| 210 |
+
{"current_steps": 2060, "total_steps": 2436, "loss": 0.4714, "lr": 7.122782456853722e-07, "epoch": 2.5372766481823783, "percentage": 84.56, "elapsed_time": "2:15:18", "remaining_time": "0:24:41"}
|
| 211 |
+
{"current_steps": 2070, "total_steps": 2436, "loss": 0.4745, "lr": 6.758570301650869e-07, "epoch": 2.549599507085644, "percentage": 84.98, "elapsed_time": "2:15:51", "remaining_time": "0:24:01"}
|
| 212 |
+
{"current_steps": 2080, "total_steps": 2436, "loss": 0.4633, "lr": 6.403240156420087e-07, "epoch": 2.5619223659889094, "percentage": 85.39, "elapsed_time": "2:16:25", "remaining_time": "0:23:20"}
|
| 213 |
+
{"current_steps": 2090, "total_steps": 2436, "loss": 0.4674, "lr": 6.056865007783602e-07, "epoch": 2.574245224892175, "percentage": 85.8, "elapsed_time": "2:16:58", "remaining_time": "0:22:40"}
|
| 214 |
+
{"current_steps": 2100, "total_steps": 2436, "loss": 0.4636, "lr": 5.7195160029617e-07, "epoch": 2.5865680837954406, "percentage": 86.21, "elapsed_time": "2:17:33", "remaining_time": "0:22:00"}
|
| 215 |
+
{"current_steps": 2110, "total_steps": 2436, "loss": 0.4612, "lr": 5.391262435158722e-07, "epoch": 2.598890942698706, "percentage": 86.62, "elapsed_time": "2:18:05", "remaining_time": "0:21:20"}
|
| 216 |
+
{"current_steps": 2120, "total_steps": 2436, "loss": 0.4548, "lr": 5.072171729329944e-07, "epoch": 2.6112138016019717, "percentage": 87.03, "elapsed_time": "2:18:39", "remaining_time": "0:20:40"}
|
| 217 |
+
{"current_steps": 2130, "total_steps": 2436, "loss": 0.4649, "lr": 4.7623094283320905e-07, "epoch": 2.6235366605052373, "percentage": 87.44, "elapsed_time": "2:19:10", "remaining_time": "0:19:59"}
|
| 218 |
+
{"current_steps": 2140, "total_steps": 2436, "loss": 0.4649, "lr": 4.4617391794604946e-07, "epoch": 2.635859519408503, "percentage": 87.85, "elapsed_time": "2:19:44", "remaining_time": "0:19:19"}
|
| 219 |
+
{"current_steps": 2150, "total_steps": 2436, "loss": 0.4729, "lr": 4.170522721375669e-07, "epoch": 2.6481823783117684, "percentage": 88.26, "elapsed_time": "2:20:19", "remaining_time": "0:18:39"}
|
| 220 |
+
{"current_steps": 2160, "total_steps": 2436, "loss": 0.4626, "lr": 3.8887198714218255e-07, "epoch": 2.660505237215034, "percentage": 88.67, "elapsed_time": "2:20:52", "remaining_time": "0:18:00"}
|
| 221 |
+
{"current_steps": 2170, "total_steps": 2436, "loss": 0.4618, "lr": 3.616388513340124e-07, "epoch": 2.6728280961182995, "percentage": 89.08, "elapsed_time": "2:21:24", "remaining_time": "0:17:20"}
|
| 222 |
+
{"current_steps": 2180, "total_steps": 2436, "loss": 0.4576, "lr": 3.3535845853790105e-07, "epoch": 2.685150955021565, "percentage": 89.49, "elapsed_time": "2:21:58", "remaining_time": "0:16:40"}
|
| 223 |
+
{"current_steps": 2190, "total_steps": 2436, "loss": 0.4557, "lr": 3.1003620688042636e-07, "epoch": 2.6974738139248307, "percentage": 89.9, "elapsed_time": "2:22:33", "remaining_time": "0:16:00"}
|
| 224 |
+
{"current_steps": 2200, "total_steps": 2436, "loss": 0.4687, "lr": 2.856772976810929e-07, "epoch": 2.709796672828096, "percentage": 90.31, "elapsed_time": "2:23:07", "remaining_time": "0:15:21"}
|
| 225 |
+
{"current_steps": 2210, "total_steps": 2436, "loss": 0.4617, "lr": 2.6228673438395804e-07, "epoch": 2.722119531731362, "percentage": 90.72, "elapsed_time": "2:23:39", "remaining_time": "0:14:41"}
|
| 226 |
+
{"current_steps": 2220, "total_steps": 2436, "loss": 0.4666, "lr": 2.398693215298953e-07, "epoch": 2.7344423906346274, "percentage": 91.13, "elapsed_time": "2:24:13", "remaining_time": "0:14:01"}
|
| 227 |
+
{"current_steps": 2230, "total_steps": 2436, "loss": 0.4757, "lr": 2.1842966376972142e-07, "epoch": 2.746765249537893, "percentage": 91.54, "elapsed_time": "2:24:48", "remaining_time": "0:13:22"}
|
| 228 |
+
{"current_steps": 2240, "total_steps": 2436, "loss": 0.4634, "lr": 1.9797216491837356e-07, "epoch": 2.7590881084411585, "percentage": 91.95, "elapsed_time": "2:25:23", "remaining_time": "0:12:43"}
|
| 229 |
+
{"current_steps": 2250, "total_steps": 2436, "loss": 0.4694, "lr": 1.7850102705034455e-07, "epoch": 2.771410967344424, "percentage": 92.36, "elapsed_time": "2:25:54", "remaining_time": "0:12:03"}
|
| 230 |
+
{"current_steps": 2260, "total_steps": 2436, "loss": 0.4637, "lr": 1.600202496365566e-07, "epoch": 2.7837338262476896, "percentage": 92.78, "elapsed_time": "2:26:26", "remaining_time": "0:11:24"}
|
| 231 |
+
{"current_steps": 2270, "total_steps": 2436, "loss": 0.4709, "lr": 1.425336287228496e-07, "epoch": 2.796056685150955, "percentage": 93.19, "elapsed_time": "2:27:00", "remaining_time": "0:10:45"}
|
| 232 |
+
{"current_steps": 2280, "total_steps": 2436, "loss": 0.46, "lr": 1.2604475615025092e-07, "epoch": 2.8083795440542207, "percentage": 93.6, "elapsed_time": "2:27:33", "remaining_time": "0:10:05"}
|
| 233 |
+
{"current_steps": 2290, "total_steps": 2436, "loss": 0.4722, "lr": 1.1055701881719838e-07, "epoch": 2.820702402957486, "percentage": 94.01, "elapsed_time": "2:28:08", "remaining_time": "0:09:26"}
|
| 234 |
+
{"current_steps": 2300, "total_steps": 2436, "loss": 0.4677, "lr": 9.607359798384785e-08, "epoch": 2.833025261860752, "percentage": 94.42, "elapsed_time": "2:28:42", "remaining_time": "0:08:47"}
|
| 235 |
+
{"current_steps": 2310, "total_steps": 2436, "loss": 0.4645, "lr": 8.259746861863094e-08, "epoch": 2.845348120764017, "percentage": 94.83, "elapsed_time": "2:29:17", "remaining_time": "0:08:08"}
|
| 236 |
+
{"current_steps": 2320, "total_steps": 2436, "loss": 0.4659, "lr": 7.013139878717934e-08, "epoch": 2.857670979667283, "percentage": 95.24, "elapsed_time": "2:29:50", "remaining_time": "0:07:29"}
|
| 237 |
+
{"current_steps": 2330, "total_steps": 2436, "loss": 0.4717, "lr": 5.8677949083749686e-08, "epoch": 2.869993838570548, "percentage": 95.65, "elapsed_time": "2:30:24", "remaining_time": "0:06:50"}
|
| 238 |
+
{"current_steps": 2340, "total_steps": 2436, "loss": 0.4562, "lr": 4.823947210526647e-08, "epoch": 2.882316697473814, "percentage": 96.06, "elapsed_time": "2:30:58", "remaining_time": "0:06:11"}
|
| 239 |
+
{"current_steps": 2350, "total_steps": 2436, "loss": 0.4784, "lr": 3.8818111968083607e-08, "epoch": 2.8946395563770793, "percentage": 96.47, "elapsed_time": "2:31:32", "remaining_time": "0:05:32"}
|
| 240 |
+
{"current_steps": 2360, "total_steps": 2436, "loss": 0.4736, "lr": 3.041580386757448e-08, "epoch": 2.9069624152803453, "percentage": 96.88, "elapsed_time": "2:32:04", "remaining_time": "0:04:53"}
|
| 241 |
+
{"current_steps": 2370, "total_steps": 2436, "loss": 0.4658, "lr": 2.3034273680632157e-08, "epoch": 2.9192852741836104, "percentage": 97.29, "elapsed_time": "2:32:38", "remaining_time": "0:04:15"}
|
| 242 |
+
{"current_steps": 2380, "total_steps": 2436, "loss": 0.4962, "lr": 1.6675037611165735e-08, "epoch": 2.9316081330868764, "percentage": 97.7, "elapsed_time": "2:33:09", "remaining_time": "0:03:36"}
|
| 243 |
+
{"current_steps": 2390, "total_steps": 2436, "loss": 0.4672, "lr": 1.1339401878663337e-08, "epoch": 2.9439309919901415, "percentage": 98.11, "elapsed_time": "2:33:44", "remaining_time": "0:02:57"}
|
| 244 |
+
{"current_steps": 2400, "total_steps": 2436, "loss": 0.4633, "lr": 7.028462449889528e-09, "epoch": 2.9562538508934075, "percentage": 98.52, "elapsed_time": "2:34:18", "remaining_time": "0:02:18"}
|
| 245 |
+
{"current_steps": 2410, "total_steps": 2436, "loss": 0.4541, "lr": 3.743104813767051e-09, "epoch": 2.9685767097966727, "percentage": 98.93, "elapsed_time": "2:34:50", "remaining_time": "0:01:40"}
|
| 246 |
+
{"current_steps": 2420, "total_steps": 2436, "loss": 0.4567, "lr": 1.4840037994923173e-09, "epoch": 2.9808995686999387, "percentage": 99.34, "elapsed_time": "2:35:22", "remaining_time": "0:01:01"}
|
| 247 |
+
{"current_steps": 2430, "total_steps": 2436, "loss": 0.4601, "lr": 2.516234379235094e-10, "epoch": 2.993222427603204, "percentage": 99.75, "elapsed_time": "2:35:55", "remaining_time": "0:00:23"}
|
| 248 |
+
{"current_steps": 2436, "total_steps": 2436, "epoch": 3.0, "percentage": 100.0, "elapsed_time": "2:36:36", "remaining_time": "0:00:00"}
|
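Each entry in the trainer_log.jsonl file added above is one JSON object per line, carrying step counters (current_steps, total_steps), the running loss, the learning rate, and timing fields, with periodic eval records that carry eval_loss instead of loss. The following is a minimal sketch (not part of the commit) of how such a log could be loaded and summarized with the Python standard library; the path and field names are taken directly from the entries above, everything else is illustrative.

import json
from pathlib import Path

# Log file added in this commit (adjust to your local checkout).
LOG_PATH = Path("EXP_1.1_3b/trainer_log.jsonl")

def load_log(path: Path) -> list[dict]:
    """Parse one JSON object per non-empty line."""
    entries = []
    with path.open(encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line:
                entries.append(json.loads(line))
    return entries

def split_records(entries: list[dict]) -> tuple[list[dict], list[dict]]:
    """Separate training records (carry 'loss') from eval records (carry 'eval_loss')."""
    train = [e for e in entries if "loss" in e]
    evals = [e for e in entries if "eval_loss" in e]
    return train, evals

if __name__ == "__main__":
    train, evals = split_records(load_log(LOG_PATH))
    if train:
        last = train[-1]
        print(f"last logged step {last['current_steps']}/{last['total_steps']}: "
              f"loss={last['loss']}, lr={last['lr']}")
    for e in evals:
        print(f"eval at step {e['current_steps']}: eval_loss={e['eval_loss']}")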
EXP_1.1_3b/trainer_state.json
ADDED
|
@@ -0,0 +1,1776 @@
| 1 |
+
{
|
| 2 |
+
"best_global_step": null,
|
| 3 |
+
"best_metric": null,
|
| 4 |
+
"best_model_checkpoint": null,
|
| 5 |
+
"epoch": 3.0,
|
| 6 |
+
"eval_steps": 500,
|
| 7 |
+
"global_step": 2436,
|
| 8 |
+
"is_hyper_param_search": false,
|
| 9 |
+
"is_local_process_zero": true,
|
| 10 |
+
"is_world_process_zero": true,
|
| 11 |
+
"log_history": [
|
| 12 |
+
{
|
| 13 |
+
"epoch": 0.012322858903265557,
|
| 14 |
+
"grad_norm": 8.865929400670435,
|
| 15 |
+
"learning_rate": 3.6885245901639347e-07,
|
| 16 |
+
"loss": 1.5881,
|
| 17 |
+
"step": 10
|
| 18 |
+
},
|
| 19 |
+
{
|
| 20 |
+
"epoch": 0.024645717806531114,
|
| 21 |
+
"grad_norm": 7.574103078308896,
|
| 22 |
+
"learning_rate": 7.78688524590164e-07,
|
| 23 |
+
"loss": 1.6078,
|
| 24 |
+
"step": 20
|
| 25 |
+
},
|
| 26 |
+
{
|
| 27 |
+
"epoch": 0.036968576709796676,
|
| 28 |
+
"grad_norm": 6.183109440553045,
|
| 29 |
+
"learning_rate": 1.1885245901639345e-06,
|
| 30 |
+
"loss": 1.5155,
|
| 31 |
+
"step": 30
|
| 32 |
+
},
|
| 33 |
+
{
|
| 34 |
+
"epoch": 0.04929143561306223,
|
| 35 |
+
"grad_norm": 3.7877912491073786,
|
| 36 |
+
"learning_rate": 1.5983606557377053e-06,
|
| 37 |
+
"loss": 1.3049,
|
| 38 |
+
"step": 40
|
| 39 |
+
},
|
| 40 |
+
{
|
| 41 |
+
"epoch": 0.061614294516327786,
|
| 42 |
+
"grad_norm": 2.116426567902146,
|
| 43 |
+
"learning_rate": 2.0081967213114756e-06,
|
| 44 |
+
"loss": 1.1069,
|
| 45 |
+
"step": 50
|
| 46 |
+
},
|
| 47 |
+
{
|
| 48 |
+
"epoch": 0.07393715341959335,
|
| 49 |
+
"grad_norm": 1.9333052226557788,
|
| 50 |
+
"learning_rate": 2.418032786885246e-06,
|
| 51 |
+
"loss": 0.9964,
|
| 52 |
+
"step": 60
|
| 53 |
+
},
|
| 54 |
+
{
|
| 55 |
+
"epoch": 0.0862600123228589,
|
| 56 |
+
"grad_norm": 1.7274261944769131,
|
| 57 |
+
"learning_rate": 2.8278688524590166e-06,
|
| 58 |
+
"loss": 0.9526,
|
| 59 |
+
"step": 70
|
| 60 |
+
},
|
| 61 |
+
{
|
| 62 |
+
"epoch": 0.09858287122612445,
|
| 63 |
+
"grad_norm": 1.485084447219836,
|
| 64 |
+
"learning_rate": 3.2377049180327876e-06,
|
| 65 |
+
"loss": 0.9071,
|
| 66 |
+
"step": 80
|
| 67 |
+
},
|
| 68 |
+
{
|
| 69 |
+
"epoch": 0.11090573012939002,
|
| 70 |
+
"grad_norm": 1.6878869537634686,
|
| 71 |
+
"learning_rate": 3.6475409836065577e-06,
|
| 72 |
+
"loss": 0.8833,
|
| 73 |
+
"step": 90
|
| 74 |
+
},
|
| 75 |
+
{
|
| 76 |
+
"epoch": 0.12322858903265557,
|
| 77 |
+
"grad_norm": 1.6224576695093869,
|
| 78 |
+
"learning_rate": 4.057377049180329e-06,
|
| 79 |
+
"loss": 0.8617,
|
| 80 |
+
"step": 100
|
| 81 |
+
},
|
| 82 |
+
{
|
| 83 |
+
"epoch": 0.13555144793592114,
|
| 84 |
+
"grad_norm": 1.5341177753373625,
|
| 85 |
+
"learning_rate": 4.467213114754098e-06,
|
| 86 |
+
"loss": 0.8439,
|
| 87 |
+
"step": 110
|
| 88 |
+
},
|
| 89 |
+
{
|
| 90 |
+
"epoch": 0.1478743068391867,
|
| 91 |
+
"grad_norm": 1.680475740650815,
|
| 92 |
+
"learning_rate": 4.877049180327869e-06,
|
| 93 |
+
"loss": 0.8494,
|
| 94 |
+
"step": 120
|
| 95 |
+
},
|
| 96 |
+
{
|
| 97 |
+
"epoch": 0.16019716574245224,
|
| 98 |
+
"grad_norm": 1.8320444027458695,
|
| 99 |
+
"learning_rate": 5.286885245901639e-06,
|
| 100 |
+
"loss": 0.8114,
|
| 101 |
+
"step": 130
|
| 102 |
+
},
|
| 103 |
+
{
|
| 104 |
+
"epoch": 0.1725200246457178,
|
| 105 |
+
"grad_norm": 1.653101255301024,
|
| 106 |
+
"learning_rate": 5.696721311475411e-06,
|
| 107 |
+
"loss": 0.7865,
|
| 108 |
+
"step": 140
|
| 109 |
+
},
|
| 110 |
+
{
|
| 111 |
+
"epoch": 0.18484288354898337,
|
| 112 |
+
"grad_norm": 1.845033324261731,
|
| 113 |
+
"learning_rate": 6.10655737704918e-06,
|
| 114 |
+
"loss": 0.8102,
|
| 115 |
+
"step": 150
|
| 116 |
+
},
|
| 117 |
+
{
|
| 118 |
+
"epoch": 0.1971657424522489,
|
| 119 |
+
"grad_norm": 1.50017872275538,
|
| 120 |
+
"learning_rate": 6.516393442622952e-06,
|
| 121 |
+
"loss": 0.7978,
|
| 122 |
+
"step": 160
|
| 123 |
+
},
|
| 124 |
+
{
|
| 125 |
+
"epoch": 0.20948860135551448,
|
| 126 |
+
"grad_norm": 1.7319722249969618,
|
| 127 |
+
"learning_rate": 6.926229508196722e-06,
|
| 128 |
+
"loss": 0.7969,
|
| 129 |
+
"step": 170
|
| 130 |
+
},
|
| 131 |
+
{
|
| 132 |
+
"epoch": 0.22181146025878004,
|
| 133 |
+
"grad_norm": 1.5125736552951485,
|
| 134 |
+
"learning_rate": 7.336065573770492e-06,
|
| 135 |
+
"loss": 0.7821,
|
| 136 |
+
"step": 180
|
| 137 |
+
},
|
| 138 |
+
{
|
| 139 |
+
"epoch": 0.2341343191620456,
|
| 140 |
+
"grad_norm": 1.8457968609015014,
|
| 141 |
+
"learning_rate": 7.745901639344263e-06,
|
| 142 |
+
"loss": 0.7654,
|
| 143 |
+
"step": 190
|
| 144 |
+
},
|
| 145 |
+
{
|
| 146 |
+
"epoch": 0.24645717806531114,
|
| 147 |
+
"grad_norm": 1.612579047273975,
|
| 148 |
+
"learning_rate": 8.155737704918034e-06,
|
| 149 |
+
"loss": 0.7924,
|
| 150 |
+
"step": 200
|
| 151 |
+
},
|
| 152 |
+
{
|
| 153 |
+
"epoch": 0.2587800369685767,
|
| 154 |
+
"grad_norm": 1.7222316881259245,
|
| 155 |
+
"learning_rate": 8.565573770491804e-06,
|
| 156 |
+
"loss": 0.7754,
|
| 157 |
+
"step": 210
|
| 158 |
+
},
|
| 159 |
+
{
|
| 160 |
+
"epoch": 0.2711028958718423,
|
| 161 |
+
"grad_norm": 1.6823495566135107,
|
| 162 |
+
"learning_rate": 8.975409836065575e-06,
|
| 163 |
+
"loss": 0.7685,
|
| 164 |
+
"step": 220
|
| 165 |
+
},
|
| 166 |
+
{
|
| 167 |
+
"epoch": 0.28342575477510784,
|
| 168 |
+
"grad_norm": 1.8243123794941252,
|
| 169 |
+
"learning_rate": 9.385245901639345e-06,
|
| 170 |
+
"loss": 0.766,
|
| 171 |
+
"step": 230
|
| 172 |
+
},
|
| 173 |
+
{
|
| 174 |
+
"epoch": 0.2957486136783734,
|
| 175 |
+
"grad_norm": 1.6969329352168785,
|
| 176 |
+
"learning_rate": 9.795081967213116e-06,
|
| 177 |
+
"loss": 0.7709,
|
| 178 |
+
"step": 240
|
| 179 |
+
},
|
| 180 |
+
{
|
| 181 |
+
"epoch": 0.3080714725816389,
|
| 182 |
+
"grad_norm": 1.5658877986677437,
|
| 183 |
+
"learning_rate": 9.999871620167532e-06,
|
| 184 |
+
"loss": 0.7434,
|
| 185 |
+
"step": 250
|
| 186 |
+
},
|
| 187 |
+
{
|
| 188 |
+
"epoch": 0.3203943314849045,
|
| 189 |
+
"grad_norm": 1.7193380497814985,
|
| 190 |
+
"learning_rate": 9.998844621062755e-06,
|
| 191 |
+
"loss": 0.7559,
|
| 192 |
+
"step": 260
|
| 193 |
+
},
|
| 194 |
+
{
|
| 195 |
+
"epoch": 0.33271719038817005,
|
| 196 |
+
"grad_norm": 1.769267118394799,
|
| 197 |
+
"learning_rate": 9.996790833804053e-06,
|
| 198 |
+
"loss": 0.7541,
|
| 199 |
+
"step": 270
|
| 200 |
+
},
|
| 201 |
+
{
|
| 202 |
+
"epoch": 0.3450400492914356,
|
| 203 |
+
"grad_norm": 1.6377922328571806,
|
| 204 |
+
"learning_rate": 9.993710680249788e-06,
|
| 205 |
+
"loss": 0.7565,
|
| 206 |
+
"step": 280
|
| 207 |
+
},
|
| 208 |
+
{
|
| 209 |
+
"epoch": 0.3573629081947012,
|
| 210 |
+
"grad_norm": 1.6202653826059523,
|
| 211 |
+
"learning_rate": 9.989604793079198e-06,
|
| 212 |
+
"loss": 0.7488,
|
| 213 |
+
"step": 290
|
| 214 |
+
},
|
| 215 |
+
{
|
| 216 |
+
"epoch": 0.36968576709796674,
|
| 217 |
+
"grad_norm": 1.7411263842465872,
|
| 218 |
+
"learning_rate": 9.984474015662421e-06,
|
| 219 |
+
"loss": 0.7542,
|
| 220 |
+
"step": 300
|
| 221 |
+
},
|
| 222 |
+
{
|
| 223 |
+
"epoch": 0.3820086260012323,
|
| 224 |
+
"grad_norm": 1.7305673660844891,
|
| 225 |
+
"learning_rate": 9.978319401887287e-06,
|
| 226 |
+
"loss": 0.7626,
|
| 227 |
+
"step": 310
|
| 228 |
+
},
|
| 229 |
+
{
|
| 230 |
+
"epoch": 0.3943314849044978,
|
| 231 |
+
"grad_norm": 1.6891856351420287,
|
| 232 |
+
"learning_rate": 9.971142215942817e-06,
|
| 233 |
+
"loss": 0.7445,
|
| 234 |
+
"step": 320
|
| 235 |
+
},
|
| 236 |
+
{
|
| 237 |
+
"epoch": 0.4066543438077634,
|
| 238 |
+
"grad_norm": 1.5418334432889922,
|
| 239 |
+
"learning_rate": 9.962943932059573e-06,
|
| 240 |
+
"loss": 0.7438,
|
| 241 |
+
"step": 330
|
| 242 |
+
},
|
| 243 |
+
{
|
| 244 |
+
"epoch": 0.41897720271102895,
|
| 245 |
+
"grad_norm": 1.5266927582987573,
|
| 246 |
+
"learning_rate": 9.953726234206835e-06,
|
| 247 |
+
"loss": 0.7403,
|
| 248 |
+
"step": 340
|
| 249 |
+
},
|
| 250 |
+
{
|
| 251 |
+
"epoch": 0.4313000616142945,
|
| 252 |
+
"grad_norm": 1.6404004809298283,
|
| 253 |
+
"learning_rate": 9.943491015746704e-06,
|
| 254 |
+
"loss": 0.7459,
|
| 255 |
+
"step": 350
|
| 256 |
+
},
|
| 257 |
+
{
|
| 258 |
+
"epoch": 0.4436229205175601,
|
| 259 |
+
"grad_norm": 1.5048534772298556,
|
| 260 |
+
"learning_rate": 9.9322403790452e-06,
|
| 261 |
+
"loss": 0.7361,
|
| 262 |
+
"step": 360
|
| 263 |
+
},
|
| 264 |
+
{
|
| 265 |
+
"epoch": 0.45594577942082565,
|
| 266 |
+
"grad_norm": 1.6069777870788664,
|
| 267 |
+
"learning_rate": 9.919976635040425e-06,
|
| 268 |
+
"loss": 0.7373,
|
| 269 |
+
"step": 370
|
| 270 |
+
},
|
| 271 |
+
{
|
| 272 |
+
"epoch": 0.4682686383240912,
|
| 273 |
+
"grad_norm": 1.479252474067406,
|
| 274 |
+
"learning_rate": 9.906702302767876e-06,
|
| 275 |
+
"loss": 0.7279,
|
| 276 |
+
"step": 380
|
| 277 |
+
},
|
| 278 |
+
{
|
| 279 |
+
"epoch": 0.4805914972273567,
|
| 280 |
+
"grad_norm": 1.54960054924232,
|
| 281 |
+
"learning_rate": 9.892420108843038e-06,
|
| 282 |
+
"loss": 0.7503,
|
| 283 |
+
"step": 390
|
| 284 |
+
},
|
| 285 |
+
{
|
| 286 |
+
"epoch": 0.4929143561306223,
|
| 287 |
+
"grad_norm": 1.84677255909603,
|
| 288 |
+
"learning_rate": 9.877132986901306e-06,
|
| 289 |
+
"loss": 0.7528,
|
| 290 |
+
"step": 400
|
| 291 |
+
},
|
| 292 |
+
{
|
| 293 |
+
"epoch": 0.5052372150338879,
|
| 294 |
+
"grad_norm": 1.6037926513380787,
|
| 295 |
+
"learning_rate": 9.860844076995416e-06,
|
| 296 |
+
"loss": 0.7472,
|
| 297 |
+
"step": 410
|
| 298 |
+
},
|
| 299 |
+
{
|
| 300 |
+
"epoch": 0.5175600739371534,
|
| 301 |
+
"grad_norm": 1.655327892600129,
|
| 302 |
+
"learning_rate": 9.843556724950454e-06,
|
| 303 |
+
"loss": 0.7338,
|
| 304 |
+
"step": 420
|
| 305 |
+
},
|
| 306 |
+
{
|
| 307 |
+
"epoch": 0.529882932840419,
|
| 308 |
+
"grad_norm": 1.5442494735316896,
|
| 309 |
+
"learning_rate": 9.825274481676605e-06,
|
| 310 |
+
"loss": 0.716,
|
| 311 |
+
"step": 430
|
| 312 |
+
},
|
| 313 |
+
{
|
| 314 |
+
"epoch": 0.5422057917436846,
|
| 315 |
+
"grad_norm": 1.5124305778248208,
|
| 316 |
+
"learning_rate": 9.806001102439789e-06,
|
| 317 |
+
"loss": 0.7381,
|
| 318 |
+
"step": 440
|
| 319 |
+
},
|
| 320 |
+
{
|
| 321 |
+
"epoch": 0.5545286506469501,
|
| 322 |
+
"grad_norm": 1.560981504733344,
|
| 323 |
+
"learning_rate": 9.785740546090293e-06,
|
| 324 |
+
"loss": 0.72,
|
| 325 |
+
"step": 450
|
| 326 |
+
},
|
| 327 |
+
{
|
| 328 |
+
"epoch": 0.5668515095502157,
|
| 329 |
+
"grad_norm": 1.5093735463852636,
|
| 330 |
+
"learning_rate": 9.76449697424962e-06,
|
| 331 |
+
"loss": 0.7302,
|
| 332 |
+
"step": 460
|
| 333 |
+
},
|
| 334 |
+
{
|
| 335 |
+
"epoch": 0.5791743684534812,
|
| 336 |
+
"grad_norm": 1.5144954032372582,
|
| 337 |
+
"learning_rate": 9.742274750455659e-06,
|
| 338 |
+
"loss": 0.7165,
|
| 339 |
+
"step": 470
|
| 340 |
+
},
|
| 341 |
+
{
|
| 342 |
+
"epoch": 0.5914972273567468,
|
| 343 |
+
"grad_norm": 1.4515433945811713,
|
| 344 |
+
"learning_rate": 9.719078439266399e-06,
|
| 345 |
+
"loss": 0.715,
|
| 346 |
+
"step": 480
|
| 347 |
+
},
|
| 348 |
+
{
|
| 349 |
+
"epoch": 0.6038200862600123,
|
| 350 |
+
"grad_norm": 1.5477114404836352,
|
| 351 |
+
"learning_rate": 9.69491280532234e-06,
|
| 352 |
+
"loss": 0.7328,
|
| 353 |
+
"step": 490
|
| 354 |
+
},
|
| 355 |
+
{
|
| 356 |
+
"epoch": 0.6161429451632778,
|
| 357 |
+
"grad_norm": 1.5020199584697078,
|
| 358 |
+
"learning_rate": 9.66978281236782e-06,
|
| 359 |
+
"loss": 0.705,
|
| 360 |
+
"step": 500
|
| 361 |
+
},
|
| 362 |
+
{
|
| 363 |
+
"epoch": 0.6161429451632778,
|
| 364 |
+
"eval_loss": 0.7097772359848022,
|
| 365 |
+
"eval_runtime": 289.5537,
|
| 366 |
+
"eval_samples_per_second": 19.924,
|
| 367 |
+
"eval_steps_per_second": 2.493,
|
| 368 |
+
"step": 500
|
| 369 |
+
},
|
| 370 |
+
{
|
| 371 |
+
"epoch": 0.6284658040665434,
|
| 372 |
+
"grad_norm": 1.6670022717865285,
|
| 373 |
+
"learning_rate": 9.643693622231426e-06,
|
| 374 |
+
"loss": 0.7196,
|
| 375 |
+
"step": 510
|
| 376 |
+
},
|
| 377 |
+
{
|
| 378 |
+
"epoch": 0.640788662969809,
|
| 379 |
+
"grad_norm": 1.404510439712067,
|
| 380 |
+
"learning_rate": 9.616650593765733e-06,
|
| 381 |
+
"loss": 0.7212,
|
| 382 |
+
"step": 520
|
| 383 |
+
},
|
| 384 |
+
{
|
| 385 |
+
"epoch": 0.6531115218730745,
|
| 386 |
+
"grad_norm": 1.5052601371891572,
|
| 387 |
+
"learning_rate": 9.58865928174657e-06,
|
| 388 |
+
"loss": 0.7038,
|
| 389 |
+
"step": 530
|
| 390 |
+
},
|
| 391 |
+
{
|
| 392 |
+
"epoch": 0.6654343807763401,
|
| 393 |
+
"grad_norm": 1.4958093674390802,
|
| 394 |
+
"learning_rate": 9.559725435732042e-06,
|
| 395 |
+
"loss": 0.7108,
|
| 396 |
+
"step": 540
|
| 397 |
+
},
|
| 398 |
+
{
|
| 399 |
+
"epoch": 0.6777572396796057,
|
| 400 |
+
"grad_norm": 1.5318847750594449,
|
| 401 |
+
"learning_rate": 9.529854998881534e-06,
|
| 402 |
+
"loss": 0.7164,
|
| 403 |
+
"step": 550
|
| 404 |
+
},
|
| 405 |
+
{
|
| 406 |
+
"epoch": 0.6900800985828712,
|
| 407 |
+
"grad_norm": 1.5019475478688826,
|
| 408 |
+
"learning_rate": 9.499054106734963e-06,
|
| 409 |
+
"loss": 0.7183,
|
| 410 |
+
"step": 560
|
| 411 |
+
},
|
| 412 |
+
{
|
| 413 |
+
"epoch": 0.7024029574861368,
|
| 414 |
+
"grad_norm": 1.6065560053372951,
|
| 415 |
+
"learning_rate": 9.467329085952505e-06,
|
| 416 |
+
"loss": 0.7082,
|
| 417 |
+
"step": 570
|
| 418 |
+
},
|
| 419 |
+
{
|
| 420 |
+
"epoch": 0.7147258163894024,
|
| 421 |
+
"grad_norm": 1.5660643966910845,
|
| 422 |
+
"learning_rate": 9.434686453015067e-06,
|
| 423 |
+
"loss": 0.7136,
|
| 424 |
+
"step": 580
|
| 425 |
+
},
|
| 426 |
+
{
|
| 427 |
+
"epoch": 0.7270486752926679,
|
| 428 |
+
"grad_norm": 1.5293218100937476,
|
| 429 |
+
"learning_rate": 9.401132912885764e-06,
|
| 430 |
+
"loss": 0.7225,
|
| 431 |
+
"step": 590
|
| 432 |
+
},
|
| 433 |
+
{
|
| 434 |
+
"epoch": 0.7393715341959335,
|
| 435 |
+
"grad_norm": 1.5434780622861681,
|
| 436 |
+
"learning_rate": 9.36667535763269e-06,
|
| 437 |
+
"loss": 0.7133,
|
| 438 |
+
"step": 600
|
| 439 |
+
},
|
| 440 |
+
{
|
| 441 |
+
"epoch": 0.751694393099199,
|
| 442 |
+
"grad_norm": 1.6408253382203255,
|
| 443 |
+
"learning_rate": 9.331320865013257e-06,
|
| 444 |
+
"loss": 0.7063,
|
| 445 |
+
"step": 610
|
| 446 |
+
},
|
| 447 |
+
{
|
| 448 |
+
"epoch": 0.7640172520024646,
|
| 449 |
+
"grad_norm": 1.4072416974078221,
|
| 450 |
+
"learning_rate": 9.295076697020378e-06,
|
| 451 |
+
"loss": 0.6986,
|
| 452 |
+
"step": 620
|
| 453 |
+
},
|
| 454 |
+
{
|
| 455 |
+
"epoch": 0.7763401109057301,
|
| 456 |
+
"grad_norm": 1.472755575560557,
|
| 457 |
+
"learning_rate": 9.257950298390815e-06,
|
| 458 |
+
"loss": 0.706,
|
| 459 |
+
"step": 630
|
| 460 |
+
},
|
| 461 |
+
{
|
| 462 |
+
"epoch": 0.7886629698089956,
|
| 463 |
+
"grad_norm": 1.5142987867097109,
|
| 464 |
+
"learning_rate": 9.219949295076006e-06,
|
| 465 |
+
"loss": 0.7083,
|
| 466 |
+
"step": 640
|
| 467 |
+
},
|
| 468 |
+
{
|
| 469 |
+
"epoch": 0.8009858287122612,
|
| 470 |
+
"grad_norm": 1.7078530382768096,
|
| 471 |
+
"learning_rate": 9.181081492675645e-06,
|
| 472 |
+
"loss": 0.7066,
|
| 473 |
+
"step": 650
|
| 474 |
+
},
|
| 475 |
+
{
|
| 476 |
+
"epoch": 0.8133086876155268,
|
| 477 |
+
"grad_norm": 1.4156076970190257,
|
| 478 |
+
"learning_rate": 9.141354874834372e-06,
|
| 479 |
+
"loss": 0.6922,
|
| 480 |
+
"step": 660
|
| 481 |
+
},
|
| 482 |
+
{
|
| 483 |
+
"epoch": 0.8256315465187923,
|
| 484 |
+
"grad_norm": 1.489267836422785,
|
| 485 |
+
"learning_rate": 9.100777601601896e-06,
|
| 486 |
+
"loss": 0.686,
|
| 487 |
+
"step": 670
|
| 488 |
+
},
|
| 489 |
+
{
|
| 490 |
+
"epoch": 0.8379544054220579,
|
| 491 |
+
"grad_norm": 1.5156823023774049,
|
| 492 |
+
"learning_rate": 9.05935800775688e-06,
|
| 493 |
+
"loss": 0.7045,
|
| 494 |
+
"step": 680
|
| 495 |
+
},
|
| 496 |
+
{
|
| 497 |
+
"epoch": 0.8502772643253235,
|
| 498 |
+
"grad_norm": 1.4997012195145634,
|
| 499 |
+
"learning_rate": 9.017104601094927e-06,
|
| 500 |
+
"loss": 0.6952,
|
| 501 |
+
"step": 690
|
| 502 |
+
},
|
| 503 |
+
{
|
| 504 |
+
"epoch": 0.862600123228589,
|
| 505 |
+
"grad_norm": 1.4993129316246536,
|
| 506 |
+
"learning_rate": 8.974026060681044e-06,
|
| 507 |
+
"loss": 0.6905,
|
| 508 |
+
"step": 700
|
| 509 |
+
},
|
| 510 |
+
{
|
| 511 |
+
"epoch": 0.8749229821318546,
|
| 512 |
+
"grad_norm": 1.4292715251283548,
|
| 513 |
+
"learning_rate": 8.930131235066914e-06,
|
| 514 |
+
"loss": 0.7148,
|
| 515 |
+
"step": 710
|
| 516 |
+
},
|
| 517 |
+
{
|
| 518 |
+
"epoch": 0.8872458410351202,
|
| 519 |
+
"grad_norm": 1.6096597940014286,
|
| 520 |
+
"learning_rate": 8.885429140473361e-06,
|
| 521 |
+
"loss": 0.693,
|
| 522 |
+
"step": 720
|
| 523 |
+
},
|
| 524 |
+
{
|
| 525 |
+
"epoch": 0.8995686999383857,
|
| 526 |
+
"grad_norm": 1.353744510033121,
|
| 527 |
+
"learning_rate": 8.839928958938364e-06,
|
| 528 |
+
"loss": 0.7023,
|
| 529 |
+
"step": 730
|
| 530 |
+
},
|
| 531 |
+
{
|
| 532 |
+
"epoch": 0.9118915588416513,
|
| 533 |
+
"grad_norm": 1.4213815672007308,
|
| 534 |
+
"learning_rate": 8.793640036431036e-06,
|
| 535 |
+
"loss": 0.7014,
|
| 536 |
+
"step": 740
|
| 537 |
+
},
|
| 538 |
+
{
|
| 539 |
+
"epoch": 0.9242144177449169,
|
| 540 |
+
"grad_norm": 1.4606321089003083,
|
| 541 |
+
"learning_rate": 8.746571880931896e-06,
|
| 542 |
+
"loss": 0.7153,
|
| 543 |
+
"step": 750
|
| 544 |
+
},
|
| 545 |
+
{
|
| 546 |
+
"epoch": 0.9365372766481824,
|
| 547 |
+
"grad_norm": 1.5281040298725672,
|
| 548 |
+
"learning_rate": 8.698734160479892e-06,
|
| 549 |
+
"loss": 0.7031,
|
| 550 |
+
"step": 760
|
| 551 |
+
},
|
| 552 |
+
{
|
| 553 |
+
"epoch": 0.9488601355514479,
|
| 554 |
+
"grad_norm": 1.3884649680592567,
|
| 555 |
+
"learning_rate": 8.650136701186537e-06,
|
| 556 |
+
"loss": 0.6744,
|
| 557 |
+
"step": 770
|
| 558 |
+
},
|
| 559 |
+
{
|
| 560 |
+
"epoch": 0.9611829944547134,
|
| 561 |
+
"grad_norm": 1.4546984091832766,
|
| 562 |
+
"learning_rate": 8.60078948521757e-06,
|
| 563 |
+
"loss": 0.6921,
|
| 564 |
+
"step": 780
|
| 565 |
+
},
|
| 566 |
+
{
|
| 567 |
+
"epoch": 0.973505853357979,
|
| 568 |
+
"grad_norm": 1.407393171546671,
|
| 569 |
+
"learning_rate": 8.550702648742566e-06,
|
| 570 |
+
"loss": 0.6956,
|
| 571 |
+
"step": 790
|
| 572 |
+
},
|
| 573 |
+
{
|
| 574 |
+
"epoch": 0.9858287122612446,
|
| 575 |
+
"grad_norm": 1.5419052598370024,
|
| 576 |
+
"learning_rate": 8.499886479852935e-06,
|
| 577 |
+
"loss": 0.7018,
|
| 578 |
+
"step": 800
|
| 579 |
+
},
|
| 580 |
+
{
|
| 581 |
+
"epoch": 0.9981515711645101,
|
| 582 |
+
"grad_norm": 1.3924572032379763,
|
| 583 |
+
"learning_rate": 8.448351416448664e-06,
|
| 584 |
+
"loss": 0.7032,
|
| 585 |
+
"step": 810
|
| 586 |
+
},
|
| 587 |
+
{
|
| 588 |
+
"epoch": 1.0098582871226125,
|
| 589 |
+
"grad_norm": 1.469436463403791,
|
| 590 |
+
"learning_rate": 8.39610804409435e-06,
|
| 591 |
+
"loss": 0.595,
|
| 592 |
+
"step": 820
|
| 593 |
+
},
|
| 594 |
+
{
|
| 595 |
+
"epoch": 1.022181146025878,
|
| 596 |
+
"grad_norm": 1.3472647409260032,
|
| 597 |
+
"learning_rate": 8.343167093844847e-06,
|
| 598 |
+
"loss": 0.5946,
|
| 599 |
+
"step": 830
|
| 600 |
+
},
|
| 601 |
+
{
|
| 602 |
+
"epoch": 1.0345040049291436,
|
| 603 |
+
"grad_norm": 1.4900881715555114,
|
| 604 |
+
"learning_rate": 8.289539440041066e-06,
|
| 605 |
+
"loss": 0.5845,
|
| 606 |
+
"step": 840
|
| 607 |
+
},
|
| 608 |
+
{
|
| 609 |
+
"epoch": 1.0468268638324092,
|
| 610 |
+
"grad_norm": 1.536018069046092,
|
| 611 |
+
"learning_rate": 8.23523609807633e-06,
|
| 612 |
+
"loss": 0.584,
|
| 613 |
+
"step": 850
|
| 614 |
+
},
|
| 615 |
+
{
|
| 616 |
+
"epoch": 1.0591497227356748,
|
| 617 |
+
"grad_norm": 1.3929504918120152,
|
| 618 |
+
"learning_rate": 8.180268222133748e-06,
|
| 619 |
+
"loss": 0.5847,
|
| 620 |
+
"step": 860
|
| 621 |
+
},
|
| 622 |
+
{
|
| 623 |
+
"epoch": 1.0714725816389403,
|
| 624 |
+
"grad_norm": 1.4359269869911984,
|
| 625 |
+
"learning_rate": 8.124647102895098e-06,
|
| 626 |
+
"loss": 0.5925,
|
| 627 |
+
"step": 870
|
| 628 |
+
},
|
| 629 |
+
{
|
| 630 |
+
"epoch": 1.083795440542206,
|
| 631 |
+
"grad_norm": 1.5544330412650198,
|
| 632 |
+
"learning_rate": 8.068384165221657e-06,
|
| 633 |
+
"loss": 0.584,
|
| 634 |
+
"step": 880
|
| 635 |
+
},
|
| 636 |
+
{
|
| 637 |
+
"epoch": 1.0961182994454712,
|
| 638 |
+
"grad_norm": 1.5891555387420517,
|
| 639 |
+
"learning_rate": 8.011490965807479e-06,
|
| 640 |
+
"loss": 0.5932,
|
| 641 |
+
"step": 890
|
| 642 |
+
},
|
| 643 |
+
{
|
| 644 |
+
"epoch": 1.1084411583487368,
|
| 645 |
+
"grad_norm": 1.3831550373033548,
|
| 646 |
+
"learning_rate": 7.953979190805587e-06,
|
| 647 |
+
"loss": 0.5642,
|
| 648 |
+
"step": 900
|
| 649 |
+
},
|
| 650 |
+
{
|
| 651 |
+
"epoch": 1.1207640172520024,
|
| 652 |
+
"grad_norm": 1.5096251669733274,
|
| 653 |
+
"learning_rate": 7.89586065342759e-06,
|
| 654 |
+
"loss": 0.6045,
|
| 655 |
+
"step": 910
|
| 656 |
+
},
|
| 657 |
+
{
|
| 658 |
+
"epoch": 1.133086876155268,
|
| 659 |
+
"grad_norm": 1.5477962613440504,
|
| 660 |
+
"learning_rate": 7.837147291517172e-06,
|
| 661 |
+
"loss": 0.5985,
|
| 662 |
+
"step": 920
|
| 663 |
+
},
|
| 664 |
+
{
|
| 665 |
+
"epoch": 1.1454097350585335,
|
| 666 |
+
"grad_norm": 1.4777678800069058,
|
| 667 |
+
"learning_rate": 7.777851165098012e-06,
|
| 668 |
+
"loss": 0.5686,
|
| 669 |
+
"step": 930
|
| 670 |
+
},
|
| 671 |
+
{
|
| 672 |
+
"epoch": 1.157732593961799,
|
| 673 |
+
"grad_norm": 1.4642650138802815,
|
| 674 |
+
"learning_rate": 7.717984453896585e-06,
|
| 675 |
+
"loss": 0.5888,
|
| 676 |
+
"step": 940
|
| 677 |
+
},
|
| 678 |
+
{
|
| 679 |
+
"epoch": 1.1700554528650646,
|
| 680 |
+
"grad_norm": 1.4714962123924216,
|
| 681 |
+
"learning_rate": 7.657559454840386e-06,
|
| 682 |
+
"loss": 0.5813,
|
| 683 |
+
"step": 950
|
| 684 |
+
},
|
| 685 |
+
{
|
| 686 |
+
"epoch": 1.1823783117683302,
|
| 687 |
+
"grad_norm": 1.5434418352328334,
|
| 688 |
+
"learning_rate": 7.596588579532087e-06,
|
| 689 |
+
"loss": 0.582,
|
| 690 |
+
"step": 960
|
| 691 |
+
},
|
| 692 |
+
{
|
| 693 |
+
"epoch": 1.1947011706715958,
|
| 694 |
+
"grad_norm": 1.5260417649961813,
|
| 695 |
+
"learning_rate": 7.535084351700117e-06,
|
| 696 |
+
"loss": 0.5855,
|
| 697 |
+
"step": 970
|
| 698 |
+
},
|
| 699 |
+
{
|
| 700 |
+
"epoch": 1.2070240295748613,
|
| 701 |
+
"grad_norm": 1.4802498809379745,
|
| 702 |
+
"learning_rate": 7.473059404626229e-06,
|
| 703 |
+
"loss": 0.5842,
|
| 704 |
+
"step": 980
|
| 705 |
+
},
|
| 706 |
+
{
|
| 707 |
+
"epoch": 1.219346888478127,
|
| 708 |
+
"grad_norm": 1.3297021246491412,
|
| 709 |
+
"learning_rate": 7.410526478550568e-06,
|
| 710 |
+
"loss": 0.5927,
|
| 711 |
+
"step": 990
|
| 712 |
+
},
|
| 713 |
+
{
|
| 714 |
+
"epoch": 1.2316697473813925,
|
| 715 |
+
"grad_norm": 1.4822236798755983,
|
| 716 |
+
"learning_rate": 7.34749841805475e-06,
|
| 717 |
+
"loss": 0.5925,
|
| 718 |
+
"step": 1000
|
| 719 |
+
},
|
| 720 |
+
{
|
| 721 |
+
"epoch": 1.2316697473813925,
|
| 722 |
+
"eval_loss": 0.6842279434204102,
|
| 723 |
+
"eval_runtime": 292.4878,
|
| 724 |
+
"eval_samples_per_second": 19.724,
|
| 725 |
+
"eval_steps_per_second": 2.468,
|
| 726 |
+
"step": 1000
|
| 727 |
+
},
|
| 728 |
+
{
|
| 729 |
+
"epoch": 1.243992606284658,
|
| 730 |
+
"grad_norm": 1.4439342982568775,
|
| 731 |
+
"learning_rate": 7.283988169423526e-06,
|
| 732 |
+
"loss": 0.6095,
|
| 733 |
+
"step": 1010
|
| 734 |
+
},
|
| 735 |
+
{
|
| 736 |
+
"epoch": 1.2563154651879236,
|
| 737 |
+
"grad_norm": 1.4717143885943422,
|
| 738 |
+
"learning_rate": 7.2200087779855435e-06,
|
| 739 |
+
"loss": 0.5953,
|
| 740 |
+
"step": 1020
|
| 741 |
+
},
|
| 742 |
+
{
|
| 743 |
+
"epoch": 1.2686383240911892,
|
| 744 |
+
"grad_norm": 1.3795289662769703,
|
| 745 |
+
"learning_rate": 7.155573385433772e-06,
|
| 746 |
+
"loss": 0.5913,
|
| 747 |
+
"step": 1030
|
| 748 |
+
},
|
| 749 |
+
{
|
| 750 |
+
"epoch": 1.2809611829944547,
|
| 751 |
+
"grad_norm": 1.4524123662025874,
|
| 752 |
+
"learning_rate": 7.090695227126141e-06,
|
| 753 |
+
"loss": 0.6022,
|
| 754 |
+
"step": 1040
|
| 755 |
+
},
|
| 756 |
+
{
|
| 757 |
+
"epoch": 1.2932840418977203,
|
| 758 |
+
"grad_norm": 1.490623557265845,
|
| 759 |
+
"learning_rate": 7.025387629366912e-06,
|
| 760 |
+
"loss": 0.5891,
|
| 761 |
+
"step": 1050
|
| 762 |
+
},
|
| 763 |
+
{
|
| 764 |
+
"epoch": 1.3056069008009858,
|
| 765 |
+
"grad_norm": 1.531349932144773,
|
| 766 |
+
"learning_rate": 6.959664006669404e-06,
|
| 767 |
+
"loss": 0.578,
|
| 768 |
+
"step": 1060
|
| 769 |
+
},
|
| 770 |
+
{
|
| 771 |
+
"epoch": 1.3179297597042514,
|
| 772 |
+
"grad_norm": 1.530526603204912,
|
| 773 |
+
"learning_rate": 6.893537859000576e-06,
|
| 774 |
+
"loss": 0.5837,
|
| 775 |
+
"step": 1070
|
| 776 |
+
},
|
| 777 |
+
{
|
| 778 |
+
"epoch": 1.330252618607517,
|
| 779 |
+
"grad_norm": 1.5336378586039672,
|
| 780 |
+
"learning_rate": 6.827022769008068e-06,
|
| 781 |
+
"loss": 0.5783,
|
| 782 |
+
"step": 1080
|
| 783 |
+
},
|
| 784 |
+
{
|
| 785 |
+
"epoch": 1.3425754775107825,
|
| 786 |
+
"grad_norm": 1.5280710127320882,
|
| 787 |
+
"learning_rate": 6.7601323992302525e-06,
|
| 788 |
+
"loss": 0.5882,
|
| 789 |
+
"step": 1090
|
| 790 |
+
},
|
| 791 |
+
{
|
| 792 |
+
"epoch": 1.354898336414048,
|
| 793 |
+
"grad_norm": 1.4999675248496456,
|
| 794 |
+
"learning_rate": 6.692880489289885e-06,
|
| 795 |
+
"loss": 0.5889,
|
| 796 |
+
"step": 1100
|
| 797 |
+
},
|
| 798 |
+
{
|
| 799 |
+
"epoch": 1.3672211953173137,
|
| 800 |
+
"grad_norm": 1.5626649470550829,
|
| 801 |
+
"learning_rate": 6.6252808530719095e-06,
|
| 802 |
+
"loss": 0.5886,
|
| 803 |
+
"step": 1110
|
| 804 |
+
},
|
| 805 |
+
{
|
| 806 |
+
"epoch": 1.3795440542205792,
|
| 807 |
+
"grad_norm": 1.284249388975707,
|
| 808 |
+
"learning_rate": 6.557347375886022e-06,
|
| 809 |
+
"loss": 0.5737,
|
| 810 |
+
"step": 1120
|
| 811 |
+
},
|
| 812 |
+
{
|
| 813 |
+
"epoch": 1.3918669131238448,
|
| 814 |
+
"grad_norm": 1.5307650624586169,
|
| 815 |
+
"learning_rate": 6.489094011614553e-06,
|
| 816 |
+
"loss": 0.5876,
|
| 817 |
+
"step": 1130
|
| 818 |
+
},
|
| 819 |
+
{
|
| 820 |
+
"epoch": 1.4041897720271104,
|
| 821 |
+
"grad_norm": 1.5439204209276158,
|
| 822 |
+
"learning_rate": 6.4205347798462704e-06,
|
| 823 |
+
"loss": 0.5917,
|
| 824 |
+
"step": 1140
|
| 825 |
+
},
|
| 826 |
+
{
|
| 827 |
+
"epoch": 1.416512630930376,
|
| 828 |
+
"grad_norm": 1.4449420830490325,
|
| 829 |
+
"learning_rate": 6.351683762996681e-06,
|
| 830 |
+
"loss": 0.5836,
|
| 831 |
+
"step": 1150
|
| 832 |
+
},
|
| 833 |
+
{
|
| 834 |
+
"epoch": 1.4288354898336415,
|
| 835 |
+
"grad_norm": 1.3380998089412088,
|
| 836 |
+
"learning_rate": 6.282555103415438e-06,
|
| 837 |
+
"loss": 0.5783,
|
| 838 |
+
"step": 1160
|
| 839 |
+
},
|
| 840 |
+
{
|
| 841 |
+
"epoch": 1.441158348736907,
|
| 842 |
+
"grad_norm": 1.5189010531540565,
|
| 843 |
+
"learning_rate": 6.213163000481428e-06,
|
| 844 |
+
"loss": 0.5723,
|
| 845 |
+
"step": 1170
|
| 846 |
+
},
|
| 847 |
+
{
|
| 848 |
+
"epoch": 1.4534812076401726,
|
| 849 |
+
"grad_norm": 1.4395684252424352,
|
| 850 |
+
"learning_rate": 6.143521707686137e-06,
|
| 851 |
+
"loss": 0.6022,
|
| 852 |
+
"step": 1180
|
| 853 |
+
},
|
| 854 |
+
{
|
| 855 |
+
"epoch": 1.4658040665434382,
|
| 856 |
+
"grad_norm": 1.4792406947504433,
|
| 857 |
+
"learning_rate": 6.073645529705926e-06,
|
| 858 |
+
"loss": 0.5917,
|
| 859 |
+
"step": 1190
|
| 860 |
+
},
|
| 861 |
+
{
|
| 862 |
+
"epoch": 1.4781269254467038,
|
| 863 |
+
"grad_norm": 1.3975340017935896,
|
| 864 |
+
"learning_rate": 6.0035488194637645e-06,
|
| 865 |
+
"loss": 0.5833,
|
| 866 |
+
"step": 1200
|
| 867 |
+
},
|
| 868 |
+
{
|
| 869 |
+
"epoch": 1.4904497843499693,
|
| 870 |
+
"grad_norm": 1.3964019973704387,
|
| 871 |
+
"learning_rate": 5.933245975181074e-06,
|
| 872 |
+
"loss": 0.5898,
|
| 873 |
+
"step": 1210
|
| 874 |
+
},
|
| 875 |
+
{
|
| 876 |
+
"epoch": 1.502772643253235,
|
| 877 |
+
"grad_norm": 1.4887675284639827,
|
| 878 |
+
"learning_rate": 5.8627514374202596e-06,
|
| 879 |
+
"loss": 0.5842,
|
| 880 |
+
"step": 1220
|
| 881 |
+
},
|
| 882 |
+
{
|
| 883 |
+
"epoch": 1.5150955021565005,
|
| 884 |
+
"grad_norm": 1.5230001110062263,
|
| 885 |
+
"learning_rate": 5.79207968611854e-06,
|
| 886 |
+
"loss": 0.5839,
|
| 887 |
+
"step": 1230
|
| 888 |
+
},
|
| 889 |
+
{
|
| 890 |
+
"epoch": 1.527418361059766,
|
| 891 |
+
"grad_norm": 1.6079713389994916,
|
| 892 |
+
"learning_rate": 5.721245237613704e-06,
|
| 893 |
+
"loss": 0.5764,
|
| 894 |
+
"step": 1240
|
| 895 |
+
},
|
| 896 |
+
{
|
| 897 |
+
"epoch": 1.5397412199630314,
|
| 898 |
+
"grad_norm": 1.4413876513489634,
|
| 899 |
+
"learning_rate": 5.650262641662367e-06,
|
| 900 |
+
"loss": 0.586,
|
| 901 |
+
"step": 1250
|
| 902 |
+
},
|
| 903 |
+
{
|
| 904 |
+
"epoch": 1.552064078866297,
|
| 905 |
+
"grad_norm": 1.3881515755792924,
|
| 906 |
+
"learning_rate": 5.5791464784513905e-06,
|
| 907 |
+
"loss": 0.5702,
|
| 908 |
+
"step": 1260
|
| 909 |
+
},
|
| 910 |
+
{
|
| 911 |
+
"epoch": 1.5643869377695625,
|
| 912 |
+
"grad_norm": 1.4305380922203093,
|
| 913 |
+
"learning_rate": 5.50791135560303e-06,
|
| 914 |
+
"loss": 0.5833,
|
| 915 |
+
"step": 1270
|
| 916 |
+
},
|
| 917 |
+
{
|
| 918 |
+
"epoch": 1.576709796672828,
|
| 919 |
+
"grad_norm": 1.5745707819252874,
|
| 920 |
+
"learning_rate": 5.4365719051744556e-06,
|
| 921 |
+
"loss": 0.5738,
|
| 922 |
+
"step": 1280
|
| 923 |
+
},
|
| 924 |
+
{
|
| 925 |
+
"epoch": 1.5890326555760936,
|
| 926 |
+
"grad_norm": 1.5326630255795084,
|
| 927 |
+
"learning_rate": 5.365142780652255e-06,
|
| 928 |
+
"loss": 0.5811,
|
| 929 |
+
"step": 1290
|
| 930 |
+
},
|
| 931 |
+
{
|
| 932 |
+
"epoch": 1.6013555144793592,
|
| 933 |
+
"grad_norm": 1.4817407233445945,
|
| 934 |
+
"learning_rate": 5.2936386539425325e-06,
|
| 935 |
+
"loss": 0.6035,
|
| 936 |
+
"step": 1300
|
| 937 |
+
},
|
| 938 |
+
{
|
| 939 |
+
"epoch": 1.6136783733826248,
|
| 940 |
+
"grad_norm": 1.5091402086971266,
|
| 941 |
+
"learning_rate": 5.222074212357221e-06,
|
| 942 |
+
"loss": 0.5767,
|
| 943 |
+
"step": 1310
|
| 944 |
+
},
|
| 945 |
+
{
|
| 946 |
+
"epoch": 1.6260012322858903,
|
| 947 |
+
"grad_norm": 1.3581090409841263,
|
| 948 |
+
"learning_rate": 5.150464155597239e-06,
|
| 949 |
+
"loss": 0.5798,
|
| 950 |
+
"step": 1320
|
| 951 |
+
},
|
| 952 |
+
{
|
| 953 |
+
"epoch": 1.638324091189156,
|
| 954 |
+
"grad_norm": 1.530635775774838,
|
| 955 |
+
"learning_rate": 5.0788231927330924e-06,
|
| 956 |
+
"loss": 0.5757,
|
| 957 |
+
"step": 1330
|
| 958 |
+
},
|
| 959 |
+
{
|
| 960 |
+
"epoch": 1.6506469500924215,
|
| 961 |
+
"grad_norm": 1.4118546219588066,
|
| 962 |
+
"learning_rate": 5.007166039183561e-06,
|
| 963 |
+
"loss": 0.5762,
|
| 964 |
+
"step": 1340
|
| 965 |
+
},
|
| 966 |
+
{
|
| 967 |
+
"epoch": 1.662969808995687,
|
| 968 |
+
"grad_norm": 1.4363769831225472,
|
| 969 |
+
"learning_rate": 4.935507413693071e-06,
|
| 970 |
+
"loss": 0.5758,
|
| 971 |
+
"step": 1350
|
| 972 |
+
},
|
| 973 |
+
{
|
| 974 |
+
"epoch": 1.6752926678989526,
|
| 975 |
+
"grad_norm": 1.4406194256672695,
|
| 976 |
+
"learning_rate": 4.863862035308392e-06,
|
| 977 |
+
"loss": 0.5727,
|
| 978 |
+
"step": 1360
|
| 979 |
+
},
|
| 980 |
+
{
|
| 981 |
+
"epoch": 1.6876155268022182,
|
| 982 |
+
"grad_norm": 1.450317318281763,
|
| 983 |
+
"learning_rate": 4.792244620355275e-06,
|
| 984 |
+
"loss": 0.5723,
|
| 985 |
+
"step": 1370
|
| 986 |
+
},
|
| 987 |
+
{
|
| 988 |
+
"epoch": 1.6999383857054837,
|
| 989 |
+
"grad_norm": 1.3072933545935408,
|
| 990 |
+
"learning_rate": 4.720669879415637e-06,
|
| 991 |
+
"loss": 0.58,
|
| 992 |
+
"step": 1380
|
| 993 |
+
},
|
| 994 |
+
{
|
| 995 |
+
"epoch": 1.712261244608749,
|
| 996 |
+
"grad_norm": 1.4301648378358092,
|
| 997 |
+
"learning_rate": 4.649152514305934e-06,
|
| 998 |
+
"loss": 0.5831,
|
| 999 |
+
"step": 1390
|
| 1000 |
+
},
|
| 1001 |
+
{
|
| 1002 |
+
"epoch": 1.7245841035120146,
|
| 1003 |
+
"grad_norm": 1.446398014421451,
|
| 1004 |
+
"learning_rate": 4.5777072150573355e-06,
|
| 1005 |
+
"loss": 0.5647,
|
| 1006 |
+
"step": 1400
|
| 1007 |
+
},
|
| 1008 |
+
{
|
| 1009 |
+
"epoch": 1.7369069624152802,
|
| 1010 |
+
"grad_norm": 1.335488005255364,
|
| 1011 |
+
"learning_rate": 4.506348656898316e-06,
|
| 1012 |
+
"loss": 0.5631,
|
| 1013 |
+
"step": 1410
|
| 1014 |
+
},
|
| 1015 |
+
{
|
| 1016 |
+
"epoch": 1.7492298213185458,
|
| 1017 |
+
"grad_norm": 1.4887079476746525,
|
| 1018 |
+
"learning_rate": 4.435091497240287e-06,
|
| 1019 |
+
"loss": 0.5855,
|
| 1020 |
+
"step": 1420
|
| 1021 |
+
},
|
| 1022 |
+
{
|
| 1023 |
+
"epoch": 1.7615526802218113,
|
| 1024 |
+
"grad_norm": 1.3528687038903755,
|
| 1025 |
+
"learning_rate": 4.363950372666896e-06,
|
| 1026 |
+
"loss": 0.5714,
|
| 1027 |
+
"step": 1430
|
| 1028 |
+
},
|
| 1029 |
+
{
|
| 1030 |
+
"epoch": 1.773875539125077,
|
| 1031 |
+
"grad_norm": 1.5010430485235071,
|
| 1032 |
+
"learning_rate": 4.292939895927587e-06,
|
| 1033 |
+
"loss": 0.5754,
|
| 1034 |
+
"step": 1440
|
| 1035 |
+
},
|
| 1036 |
+
{
|
| 1037 |
+
"epoch": 1.7861983980283425,
|
| 1038 |
+
"grad_norm": 1.4967607376486216,
|
| 1039 |
+
"learning_rate": 4.2220746529360745e-06,
|
| 1040 |
+
"loss": 0.5876,
|
| 1041 |
+
"step": 1450
|
| 1042 |
+
},
|
| 1043 |
+
{
|
| 1044 |
+
"epoch": 1.798521256931608,
|
| 1045 |
+
"grad_norm": 1.414501545862856,
|
| 1046 |
+
"learning_rate": 4.151369199774325e-06,
|
| 1047 |
+
"loss": 0.5787,
|
| 1048 |
+
"step": 1460
|
| 1049 |
+
},
|
| 1050 |
+
{
|
| 1051 |
+
"epoch": 1.8108441158348736,
|
| 1052 |
+
"grad_norm": 1.4353512720205093,
|
| 1053 |
+
"learning_rate": 4.080838059702656e-06,
|
| 1054 |
+
"loss": 0.5798,
|
| 1055 |
+
"step": 1470
|
| 1056 |
+
},
|
| 1057 |
+
{
|
| 1058 |
+
"epoch": 1.8231669747381392,
|
| 1059 |
+
"grad_norm": 1.4304133370737042,
|
| 1060 |
+
"learning_rate": 4.0104957201765874e-06,
|
| 1061 |
+
"loss": 0.5813,
|
| 1062 |
+
"step": 1480
|
| 1063 |
+
},
|
| 1064 |
+
{
|
| 1065 |
+
"epoch": 1.8354898336414047,
|
| 1066 |
+
"grad_norm": 1.4767308440844067,
|
| 1067 |
+
"learning_rate": 3.940356629871051e-06,
|
| 1068 |
+
"loss": 0.5737,
|
| 1069 |
+
"step": 1490
|
| 1070 |
+
},
|
| 1071 |
+
{
|
| 1072 |
+
"epoch": 1.8478126925446703,
|
| 1073 |
+
"grad_norm": 1.4076636748109828,
|
| 1074 |
+
"learning_rate": 3.870435195712547e-06,
|
| 1075 |
+
"loss": 0.5739,
|
| 1076 |
+
"step": 1500
|
| 1077 |
+
},
|
| 1078 |
+
{
|
| 1079 |
+
"epoch": 1.8478126925446703,
|
| 1080 |
+
"eval_loss": 0.6650247573852539,
|
| 1081 |
+
"eval_runtime": 293.5128,
|
| 1082 |
+
"eval_samples_per_second": 19.655,
|
| 1083 |
+
"eval_steps_per_second": 2.46,
|
| 1084 |
+
"step": 1500
|
| 1085 |
+
},
|
| 1086 |
+
{
|
| 1087 |
+
"epoch": 1.8601355514479359,
|
| 1088 |
+
"grad_norm": 1.446069322241234,
|
| 1089 |
+
"learning_rate": 3.8007457799198977e-06,
|
| 1090 |
+
"loss": 0.5996,
|
| 1091 |
+
"step": 1510
|
| 1092 |
+
},
|
| 1093 |
+
{
|
| 1094 |
+
"epoch": 1.8724584103512014,
|
| 1095 |
+
"grad_norm": 1.461756871051999,
|
| 1096 |
+
"learning_rate": 3.7313026970541687e-06,
|
| 1097 |
+
"loss": 0.5686,
|
| 1098 |
+
"step": 1520
|
| 1099 |
+
},
|
| 1100 |
+
{
|
| 1101 |
+
"epoch": 1.884781269254467,
|
| 1102 |
+
"grad_norm": 1.3839934073401108,
|
| 1103 |
+
"learning_rate": 3.662120211078385e-06,
|
| 1104 |
+
"loss": 0.5814,
|
| 1105 |
+
"step": 1530
|
| 1106 |
+
},
|
| 1107 |
+
{
|
| 1108 |
+
"epoch": 1.8971041281577325,
|
| 1109 |
+
"grad_norm": 1.430323877887507,
|
| 1110 |
+
"learning_rate": 3.5932125324276524e-06,
|
| 1111 |
+
"loss": 0.5821,
|
| 1112 |
+
"step": 1540
|
| 1113 |
+
},
|
| 1114 |
+
{
|
| 1115 |
+
"epoch": 1.9094269870609981,
|
| 1116 |
+
"grad_norm": 1.547047373146706,
|
| 1117 |
+
"learning_rate": 3.524593815090241e-06,
|
| 1118 |
+
"loss": 0.5753,
|
| 1119 |
+
"step": 1550
|
| 1120 |
+
},
|
| 1121 |
+
{
|
| 1122 |
+
"epoch": 1.9217498459642637,
|
| 1123 |
+
"grad_norm": 1.4286065771891827,
|
| 1124 |
+
"learning_rate": 3.4562781537003e-06,
|
| 1125 |
+
"loss": 0.5659,
|
| 1126 |
+
"step": 1560
|
| 1127 |
+
},
|
| 1128 |
+
{
|
| 1129 |
+
"epoch": 1.9340727048675292,
|
| 1130 |
+
"grad_norm": 1.489813539203223,
|
| 1131 |
+
"learning_rate": 3.3882795806427437e-06,
|
| 1132 |
+
"loss": 0.5705,
|
| 1133 |
+
"step": 1570
|
| 1134 |
+
},
|
| 1135 |
+
{
|
| 1136 |
+
"epoch": 1.9463955637707948,
|
| 1137 |
+
"grad_norm": 1.3797109203901303,
|
| 1138 |
+
"learning_rate": 3.320612063170926e-06,
|
| 1139 |
+
"loss": 0.573,
|
| 1140 |
+
"step": 1580
|
| 1141 |
+
},
|
| 1142 |
+
{
|
| 1143 |
+
"epoch": 1.9587184226740604,
|
| 1144 |
+
"grad_norm": 1.424911376888776,
|
| 1145 |
+
"learning_rate": 3.2532895005376943e-06,
|
| 1146 |
+
"loss": 0.5594,
|
| 1147 |
+
"step": 1590
|
| 1148 |
+
},
|
| 1149 |
+
{
|
| 1150 |
+
"epoch": 1.971041281577326,
|
| 1151 |
+
"grad_norm": 1.4893262359299044,
|
| 1152 |
+
"learning_rate": 3.18632572114042e-06,
|
| 1153 |
+
"loss": 0.5682,
|
| 1154 |
+
"step": 1600
|
| 1155 |
+
},
|
| 1156 |
+
{
|
| 1157 |
+
"epoch": 1.9833641404805915,
|
| 1158 |
+
"grad_norm": 1.5364372251837122,
|
| 1159 |
+
"learning_rate": 3.1197344796805675e-06,
|
| 1160 |
+
"loss": 0.5642,
|
| 1161 |
+
"step": 1610
|
| 1162 |
+
},
|
| 1163 |
+
{
|
| 1164 |
+
"epoch": 1.995686999383857,
|
| 1165 |
+
"grad_norm": 1.3917689573647591,
|
| 1166 |
+
"learning_rate": 3.0535294543384074e-06,
|
| 1167 |
+
"loss": 0.5691,
|
| 1168 |
+
"step": 1620
|
| 1169 |
+
},
|
| 1170 |
+
{
|
| 1171 |
+
"epoch": 2.0073937153419594,
|
| 1172 |
+
"grad_norm": 1.2335683413193184,
|
| 1173 |
+
"learning_rate": 2.987724243963458e-06,
|
| 1174 |
+
"loss": 0.4903,
|
| 1175 |
+
"step": 1630
|
| 1176 |
+
},
|
| 1177 |
+
{
|
| 1178 |
+
"epoch": 2.019716574245225,
|
| 1179 |
+
"grad_norm": 1.4879923707562226,
|
| 1180 |
+
"learning_rate": 2.922332365281201e-06,
|
| 1181 |
+
"loss": 0.4882,
|
| 1182 |
+
"step": 1640
|
| 1183 |
+
},
|
| 1184 |
+
{
|
| 1185 |
+
"epoch": 2.0320394331484906,
|
| 1186 |
+
"grad_norm": 1.470082730902103,
|
| 1187 |
+
"learning_rate": 2.857367250116682e-06,
|
| 1188 |
+
"loss": 0.4714,
|
| 1189 |
+
"step": 1650
|
| 1190 |
+
},
|
| 1191 |
+
{
|
| 1192 |
+
"epoch": 2.044362292051756,
|
| 1193 |
+
"grad_norm": 1.3868059292330677,
|
| 1194 |
+
"learning_rate": 2.7928422426355554e-06,
|
| 1195 |
+
"loss": 0.4748,
|
| 1196 |
+
"step": 1660
|
| 1197 |
+
},
|
| 1198 |
+
{
|
| 1199 |
+
"epoch": 2.0566851509550217,
|
| 1200 |
+
"grad_norm": 1.3074599768704374,
|
| 1201 |
+
"learning_rate": 2.728770596603105e-06,
|
| 1202 |
+
"loss": 0.4764,
|
| 1203 |
+
"step": 1670
|
| 1204 |
+
},
|
| 1205 |
+
{
|
| 1206 |
+
"epoch": 2.0690080098582873,
|
| 1207 |
+
"grad_norm": 1.32948215843512,
|
| 1208 |
+
"learning_rate": 2.665165472661866e-06,
|
| 1209 |
+
"loss": 0.4735,
|
| 1210 |
+
"step": 1680
|
| 1211 |
+
},
|
| 1212 |
+
{
|
| 1213 |
+
"epoch": 2.081330868761553,
|
| 1214 |
+
"grad_norm": 1.288105917381929,
|
| 1215 |
+
"learning_rate": 2.6020399356283586e-06,
|
| 1216 |
+
"loss": 0.4578,
|
| 1217 |
+
"step": 1690
|
| 1218 |
+
},
|
| 1219 |
+
{
|
| 1220 |
+
"epoch": 2.0936537276648184,
|
| 1221 |
+
"grad_norm": 1.5083815524530002,
|
| 1222 |
+
"learning_rate": 2.539406951809512e-06,
|
| 1223 |
+
"loss": 0.4737,
|
| 1224 |
+
"step": 1700
|
| 1225 |
+
},
|
| 1226 |
+
{
|
| 1227 |
+
"epoch": 2.105976586568084,
|
| 1228 |
+
"grad_norm": 1.370643565971273,
|
| 1229 |
+
"learning_rate": 2.477279386339309e-06,
|
| 1230 |
+
"loss": 0.4695,
|
| 1231 |
+
"step": 1710
|
| 1232 |
+
},
|
| 1233 |
+
{
|
| 1234 |
+
"epoch": 2.1182994454713495,
|
| 1235 |
+
"grad_norm": 1.539465680425723,
|
| 1236 |
+
"learning_rate": 2.4156700005362384e-06,
|
| 1237 |
+
"loss": 0.4785,
|
| 1238 |
+
"step": 1720
|
| 1239 |
+
},
|
| 1240 |
+
{
|
| 1241 |
+
"epoch": 2.130622304374615,
|
| 1242 |
+
"grad_norm": 1.3161398723417894,
|
| 1243 |
+
"learning_rate": 2.3545914492820366e-06,
|
| 1244 |
+
"loss": 0.4723,
|
| 1245 |
+
"step": 1730
|
| 1246 |
+
},
|
| 1247 |
+
{
|
| 1248 |
+
"epoch": 2.1429451632778806,
|
| 1249 |
+
"grad_norm": 1.3496232354886113,
|
| 1250 |
+
"learning_rate": 2.2940562784223224e-06,
|
| 1251 |
+
"loss": 0.4672,
|
| 1252 |
+
"step": 1740
|
| 1253 |
+
},
|
| 1254 |
+
{
|
| 1255 |
+
"epoch": 2.155268022181146,
|
| 1256 |
+
"grad_norm": 1.4560762054385992,
|
| 1257 |
+
"learning_rate": 2.234076922189613e-06,
|
| 1258 |
+
"loss": 0.4734,
|
| 1259 |
+
"step": 1750
|
| 1260 |
+
},
|
| 1261 |
+
{
|
| 1262 |
+
"epoch": 2.167590881084412,
|
| 1263 |
+
"grad_norm": 1.4261953514874182,
|
| 1264 |
+
"learning_rate": 2.174665700649267e-06,
|
| 1265 |
+
"loss": 0.4512,
|
| 1266 |
+
"step": 1760
|
| 1267 |
+
},
|
| 1268 |
+
{
|
| 1269 |
+
"epoch": 2.1799137399876773,
|
| 1270 |
+
"grad_norm": 1.402399540780295,
|
| 1271 |
+
"learning_rate": 2.1158348171688888e-06,
|
| 1272 |
+
"loss": 0.4678,
|
| 1273 |
+
"step": 1770
|
| 1274 |
+
},
|
| 1275 |
+
{
|
| 1276 |
+
"epoch": 2.1922365988909425,
|
| 1277 |
+
"grad_norm": 1.5018053091348222,
|
| 1278 |
+
"learning_rate": 2.0575963559116823e-06,
|
| 1279 |
+
"loss": 0.4719,
|
| 1280 |
+
"step": 1780
|
| 1281 |
+
},
|
| 1282 |
+
{
|
| 1283 |
+
"epoch": 2.2045594577942085,
|
| 1284 |
+
"grad_norm": 1.3841680388808222,
|
| 1285 |
+
"learning_rate": 1.999962279354311e-06,
|
| 1286 |
+
"loss": 0.4539,
|
| 1287 |
+
"step": 1790
|
| 1288 |
+
},
|
| 1289 |
+
{
|
| 1290 |
+
"epoch": 2.2168823166974736,
|
| 1291 |
+
"grad_norm": 1.4988666932124328,
|
| 1292 |
+
"learning_rate": 1.942944425829741e-06,
|
| 1293 |
+
"loss": 0.4759,
|
| 1294 |
+
"step": 1800
|
| 1295 |
+
},
|
| 1296 |
+
{
|
| 1297 |
+
"epoch": 2.229205175600739,
|
| 1298 |
+
"grad_norm": 1.4604660741651032,
|
| 1299 |
+
"learning_rate": 1.8865545070955882e-06,
|
| 1300 |
+
"loss": 0.4633,
|
| 1301 |
+
"step": 1810
|
| 1302 |
+
},
|
| 1303 |
+
{
|
| 1304 |
+
"epoch": 2.2415280345040047,
|
| 1305 |
+
"grad_norm": 1.4729154256839592,
|
| 1306 |
+
"learning_rate": 1.8308041059284621e-06,
|
| 1307 |
+
"loss": 0.4683,
|
| 1308 |
+
"step": 1820
|
| 1309 |
+
},
|
| 1310 |
+
{
|
| 1311 |
+
"epoch": 2.2538508934072703,
|
| 1312 |
+
"grad_norm": 1.3951925274071466,
|
| 1313 |
+
"learning_rate": 1.775704673744809e-06,
|
| 1314 |
+
"loss": 0.477,
|
| 1315 |
+
"step": 1830
|
| 1316 |
+
},
|
| 1317 |
+
{
|
| 1318 |
+
"epoch": 2.266173752310536,
|
| 1319 |
+
"grad_norm": 1.3274198340736063,
|
| 1320 |
+
"learning_rate": 1.7212675282487269e-06,
|
| 1321 |
+
"loss": 0.4792,
|
| 1322 |
+
"step": 1840
|
| 1323 |
+
},
|
| 1324 |
+
{
|
| 1325 |
+
"epoch": 2.2784966112138014,
|
| 1326 |
+
"grad_norm": 1.47289795633966,
|
| 1327 |
+
"learning_rate": 1.6675038511072518e-06,
|
| 1328 |
+
"loss": 0.4704,
|
| 1329 |
+
"step": 1850
|
| 1330 |
+
},
|
| 1331 |
+
{
|
| 1332 |
+
"epoch": 2.290819470117067,
|
| 1333 |
+
"grad_norm": 1.464939828470514,
|
| 1334 |
+
"learning_rate": 1.6144246856535933e-06,
|
| 1335 |
+
"loss": 0.4685,
|
| 1336 |
+
"step": 1860
|
| 1337 |
+
},
|
| 1338 |
+
{
|
| 1339 |
+
"epoch": 2.3031423290203326,
|
| 1340 |
+
"grad_norm": 1.5114712361380727,
|
| 1341 |
+
"learning_rate": 1.5620409346187697e-06,
|
| 1342 |
+
"loss": 0.4786,
|
| 1343 |
+
"step": 1870
|
| 1344 |
+
},
|
| 1345 |
+
{
|
| 1346 |
+
"epoch": 2.315465187923598,
|
| 1347 |
+
"grad_norm": 1.470496061687908,
|
| 1348 |
+
"learning_rate": 1.510363357892133e-06,
|
| 1349 |
+
"loss": 0.469,
|
| 1350 |
+
"step": 1880
|
| 1351 |
+
},
|
| 1352 |
+
{
|
| 1353 |
+
"epoch": 2.3277880468268637,
|
| 1354 |
+
"grad_norm": 1.3693122919551104,
|
| 1355 |
+
"learning_rate": 1.4594025703112397e-06,
|
| 1356 |
+
"loss": 0.4706,
|
| 1357 |
+
"step": 1890
|
| 1358 |
+
},
|
| 1359 |
+
{
|
| 1360 |
+
"epoch": 2.3401109057301293,
|
| 1361 |
+
"grad_norm": 1.4248383166060672,
|
| 1362 |
+
"learning_rate": 1.4091690394814989e-06,
|
| 1363 |
+
"loss": 0.4586,
|
| 1364 |
+
"step": 1900
|
| 1365 |
+
},
|
| 1366 |
+
{
|
| 1367 |
+
"epoch": 2.352433764633395,
|
| 1368 |
+
"grad_norm": 1.3906892416250238,
|
| 1369 |
+
"learning_rate": 1.359673083626079e-06,
|
| 1370 |
+
"loss": 0.4679,
|
| 1371 |
+
"step": 1910
|
| 1372 |
+
},
|
| 1373 |
+
{
|
| 1374 |
+
"epoch": 2.3647566235366604,
|
| 1375 |
+
"grad_norm": 1.4984855730756004,
|
| 1376 |
+
"learning_rate": 1.3109248694664917e-06,
|
| 1377 |
+
"loss": 0.4688,
|
| 1378 |
+
"step": 1920
|
| 1379 |
+
},
|
| 1380 |
+
{
|
| 1381 |
+
"epoch": 2.377079482439926,
|
| 1382 |
+
"grad_norm": 1.3070944407719698,
|
| 1383 |
+
"learning_rate": 1.262934410134292e-06,
|
| 1384 |
+
"loss": 0.4692,
|
| 1385 |
+
"step": 1930
|
| 1386 |
+
},
|
| 1387 |
+
{
|
| 1388 |
+
"epoch": 2.3894023413431915,
|
| 1389 |
+
"grad_norm": 1.456513284182291,
|
| 1390 |
+
"learning_rate": 1.2157115631143384e-06,
|
| 1391 |
+
"loss": 0.4734,
|
| 1392 |
+
"step": 1940
|
| 1393 |
+
},
|
| 1394 |
+
{
|
| 1395 |
+
"epoch": 2.401725200246457,
|
| 1396 |
+
"grad_norm": 1.4802761342220625,
|
| 1397 |
+
"learning_rate": 1.169266028220004e-06,
|
| 1398 |
+
"loss": 0.4789,
|
| 1399 |
+
"step": 1950
|
| 1400 |
+
},
|
| 1401 |
+
{
|
| 1402 |
+
"epoch": 2.4140480591497226,
|
| 1403 |
+
"grad_norm": 1.4862600637634056,
|
| 1404 |
+
"learning_rate": 1.1236073456007928e-06,
|
| 1405 |
+
"loss": 0.4761,
|
| 1406 |
+
"step": 1960
|
| 1407 |
+
},
|
| 1408 |
+
{
|
| 1409 |
+
"epoch": 2.426370918052988,
|
| 1410 |
+
"grad_norm": 1.436924860825337,
|
| 1411 |
+
"learning_rate": 1.0787448937827428e-06,
|
| 1412 |
+
"loss": 0.4506,
|
| 1413 |
+
"step": 1970
|
| 1414 |
+
},
|
| 1415 |
+
{
|
| 1416 |
+
"epoch": 2.438693776956254,
|
| 1417 |
+
"grad_norm": 1.4729675955814119,
|
| 1418 |
+
"learning_rate": 1.034687887742028e-06,
|
| 1419 |
+
"loss": 0.4661,
|
| 1420 |
+
"step": 1980
|
| 1421 |
+
},
|
| 1422 |
+
{
|
| 1423 |
+
"epoch": 2.4510166358595193,
|
| 1424 |
+
"grad_norm": 1.4365421186657983,
|
| 1425 |
+
"learning_rate": 9.914453770121557e-07,
|
| 1426 |
+
"loss": 0.4535,
|
| 1427 |
+
"step": 1990
|
| 1428 |
+
},
|
| 1429 |
+
{
|
| 1430 |
+
"epoch": 2.463339494762785,
|
| 1431 |
+
"grad_norm": 1.3393762796043207,
|
| 1432 |
+
"learning_rate": 9.490262438251496e-07,
|
| 1433 |
+
"loss": 0.4637,
|
| 1434 |
+
"step": 2000
|
| 1435 |
+
},
|
| 1436 |
+
{
|
| 1437 |
+
"epoch": 2.463339494762785,
|
| 1438 |
+
"eval_loss": 0.6939365267753601,
|
| 1439 |
+
"eval_runtime": 290.4872,
|
| 1440 |
+
"eval_samples_per_second": 19.86,
|
| 1441 |
+
"eval_steps_per_second": 2.485,
|
| 1442 |
+
"step": 2000
|
| 1443 |
+
},
|
| 1444 |
+
{
|
| 1445 |
+
"epoch": 2.4756623536660505,
|
| 1446 |
+
"grad_norm": 1.318600880357867,
|
| 1447 |
+
"learning_rate": 9.07439201287088e-07,
|
| 1448 |
+
"loss": 0.4673,
|
| 1449 |
+
"step": 2010
|
| 1450 |
+
},
|
| 1451 |
+
{
|
| 1452 |
+
"epoch": 2.487985212569316,
|
| 1453 |
+
"grad_norm": 1.4047247838739947,
|
| 1454 |
+
"learning_rate": 8.666927915883905e-07,
|
| 1455 |
+
"loss": 0.4594,
|
| 1456 |
+
"step": 2020
|
| 1457 |
+
},
|
| 1458 |
+
{
|
| 1459 |
+
"epoch": 2.5003080714725816,
|
| 1460 |
+
"grad_norm": 1.3215657962974705,
|
| 1461 |
+
"learning_rate": 8.2679538424921e-07,
|
| 1462 |
+
"loss": 0.4633,
|
| 1463 |
+
"step": 2030
|
| 1464 |
+
},
|
| 1465 |
+
{
|
| 1466 |
+
"epoch": 2.512630930375847,
|
| 1467 |
+
"grad_norm": 1.3898207583964082,
|
| 1468 |
+
"learning_rate": 7.877551744002881e-07,
|
| 1469 |
+
"loss": 0.4716,
|
| 1470 |
+
"step": 2040
|
| 1471 |
+
},
|
| 1472 |
+
{
|
| 1473 |
+
"epoch": 2.5249537892791127,
|
| 1474 |
+
"grad_norm": 1.467272084224576,
|
| 1475 |
+
"learning_rate": 7.495801810996334e-07,
|
| 1476 |
+
"loss": 0.4748,
|
| 1477 |
+
"step": 2050
|
| 1478 |
+
},
|
| 1479 |
+
{
|
| 1480 |
+
"epoch": 2.5372766481823783,
|
| 1481 |
+
"grad_norm": 1.5209575533016215,
|
| 1482 |
+
"learning_rate": 7.122782456853722e-07,
|
| 1483 |
+
"loss": 0.4714,
|
| 1484 |
+
"step": 2060
|
| 1485 |
+
},
|
| 1486 |
+
{
|
| 1487 |
+
"epoch": 2.549599507085644,
|
| 1488 |
+
"grad_norm": 1.433751017027838,
|
| 1489 |
+
"learning_rate": 6.758570301650869e-07,
|
| 1490 |
+
"loss": 0.4745,
|
| 1491 |
+
"step": 2070
|
| 1492 |
+
},
|
| 1493 |
+
{
|
| 1494 |
+
"epoch": 2.5619223659889094,
|
| 1495 |
+
"grad_norm": 1.3745904975610637,
|
| 1496 |
+
"learning_rate": 6.403240156420087e-07,
|
| 1497 |
+
"loss": 0.4633,
|
| 1498 |
+
"step": 2080
|
| 1499 |
+
},
|
| 1500 |
+
{
|
| 1501 |
+
"epoch": 2.574245224892175,
|
| 1502 |
+
"grad_norm": 1.5462474471731622,
|
| 1503 |
+
"learning_rate": 6.056865007783602e-07,
|
| 1504 |
+
"loss": 0.4674,
|
| 1505 |
+
"step": 2090
|
| 1506 |
+
},
|
| 1507 |
+
{
|
| 1508 |
+
"epoch": 2.5865680837954406,
|
| 1509 |
+
"grad_norm": 1.395318352381848,
|
| 1510 |
+
"learning_rate": 5.7195160029617e-07,
|
| 1511 |
+
"loss": 0.4636,
|
| 1512 |
+
"step": 2100
|
| 1513 |
+
},
|
| 1514 |
+
{
|
| 1515 |
+
"epoch": 2.598890942698706,
|
| 1516 |
+
"grad_norm": 1.4508968784638185,
|
| 1517 |
+
"learning_rate": 5.391262435158722e-07,
|
| 1518 |
+
"loss": 0.4612,
|
| 1519 |
+
"step": 2110
|
| 1520 |
+
},
|
| 1521 |
+
{
|
| 1522 |
+
"epoch": 2.6112138016019717,
|
| 1523 |
+
"grad_norm": 1.5179245072065153,
|
| 1524 |
+
"learning_rate": 5.072171729329944e-07,
|
| 1525 |
+
"loss": 0.4548,
|
| 1526 |
+
"step": 2120
|
| 1527 |
+
},
|
| 1528 |
+
{
|
| 1529 |
+
"epoch": 2.6235366605052373,
|
| 1530 |
+
"grad_norm": 1.4994168181906784,
|
| 1531 |
+
"learning_rate": 4.7623094283320905e-07,
|
| 1532 |
+
"loss": 0.4649,
|
| 1533 |
+
"step": 2130
|
| 1534 |
+
},
|
| 1535 |
+
{
|
| 1536 |
+
"epoch": 2.635859519408503,
|
| 1537 |
+
"grad_norm": 1.4135373611963087,
|
| 1538 |
+
"learning_rate": 4.4617391794604946e-07,
|
| 1539 |
+
"loss": 0.4649,
|
| 1540 |
+
"step": 2140
|
| 1541 |
+
},
|
| 1542 |
+
{
|
| 1543 |
+
"epoch": 2.6481823783117684,
|
| 1544 |
+
"grad_norm": 1.2683747528125564,
|
| 1545 |
+
"learning_rate": 4.170522721375669e-07,
|
| 1546 |
+
"loss": 0.4729,
|
| 1547 |
+
"step": 2150
|
| 1548 |
+
},
|
| 1549 |
+
{
|
| 1550 |
+
"epoch": 2.660505237215034,
|
| 1551 |
+
"grad_norm": 1.3525207032019462,
|
| 1552 |
+
"learning_rate": 3.8887198714218255e-07,
|
| 1553 |
+
"loss": 0.4626,
|
| 1554 |
+
"step": 2160
|
| 1555 |
+
},
|
| 1556 |
+
{
|
| 1557 |
+
"epoch": 2.6728280961182995,
|
| 1558 |
+
"grad_norm": 1.4316798644144118,
|
| 1559 |
+
"learning_rate": 3.616388513340124e-07,
|
| 1560 |
+
"loss": 0.4618,
|
| 1561 |
+
"step": 2170
|
| 1562 |
+
},
|
| 1563 |
+
{
|
| 1564 |
+
"epoch": 2.685150955021565,
|
| 1565 |
+
"grad_norm": 1.506137937558267,
|
| 1566 |
+
"learning_rate": 3.3535845853790105e-07,
|
| 1567 |
+
"loss": 0.4576,
|
| 1568 |
+
"step": 2180
|
| 1569 |
+
},
|
| 1570 |
+
{
|
| 1571 |
+
"epoch": 2.6974738139248307,
|
| 1572 |
+
"grad_norm": 1.2499089130102832,
|
| 1573 |
+
"learning_rate": 3.1003620688042636e-07,
|
| 1574 |
+
"loss": 0.4557,
|
| 1575 |
+
"step": 2190
|
| 1576 |
+
},
|
| 1577 |
+
{
|
| 1578 |
+
"epoch": 2.709796672828096,
|
| 1579 |
+
"grad_norm": 1.5041243905421406,
|
| 1580 |
+
"learning_rate": 2.856772976810929e-07,
|
| 1581 |
+
"loss": 0.4687,
|
| 1582 |
+
"step": 2200
|
| 1583 |
+
},
|
| 1584 |
+
{
|
| 1585 |
+
"epoch": 2.722119531731362,
|
| 1586 |
+
"grad_norm": 1.3256641357947336,
|
| 1587 |
+
"learning_rate": 2.6228673438395804e-07,
|
| 1588 |
+
"loss": 0.4617,
|
| 1589 |
+
"step": 2210
|
| 1590 |
+
},
|
| 1591 |
+
{
|
| 1592 |
+
"epoch": 2.7344423906346274,
|
| 1593 |
+
"grad_norm": 1.516883255523505,
|
| 1594 |
+
"learning_rate": 2.398693215298953e-07,
|
| 1595 |
+
"loss": 0.4666,
|
| 1596 |
+
"step": 2220
|
| 1597 |
+
},
|
| 1598 |
+
{
|
| 1599 |
+
"epoch": 2.746765249537893,
|
| 1600 |
+
"grad_norm": 1.2550064770530165,
|
| 1601 |
+
"learning_rate": 2.1842966376972142e-07,
|
| 1602 |
+
"loss": 0.4757,
|
| 1603 |
+
"step": 2230
|
| 1604 |
+
},
|
| 1605 |
+
{
|
| 1606 |
+
"epoch": 2.7590881084411585,
|
| 1607 |
+
"grad_norm": 1.4223713837529477,
|
| 1608 |
+
"learning_rate": 1.9797216491837356e-07,
|
| 1609 |
+
"loss": 0.4634,
|
| 1610 |
+
"step": 2240
|
| 1611 |
+
},
|
| 1612 |
+
{
|
| 1613 |
+
"epoch": 2.771410967344424,
|
| 1614 |
+
"grad_norm": 1.3606827390943357,
|
| 1615 |
+
"learning_rate": 1.7850102705034455e-07,
|
| 1616 |
+
"loss": 0.4694,
|
| 1617 |
+
"step": 2250
|
| 1618 |
+
},
|
| 1619 |
+
{
|
| 1620 |
+
"epoch": 2.7837338262476896,
|
| 1621 |
+
"grad_norm": 1.4381020424607691,
|
| 1622 |
+
"learning_rate": 1.600202496365566e-07,
|
| 1623 |
+
"loss": 0.4637,
|
| 1624 |
+
"step": 2260
|
| 1625 |
+
},
|
| 1626 |
+
{
|
| 1627 |
+
"epoch": 2.796056685150955,
|
| 1628 |
+
"grad_norm": 1.4541821978517326,
|
| 1629 |
+
"learning_rate": 1.425336287228496e-07,
|
| 1630 |
+
"loss": 0.4709,
|
| 1631 |
+
"step": 2270
|
| 1632 |
+
},
|
| 1633 |
+
{
|
| 1634 |
+
"epoch": 2.8083795440542207,
|
| 1635 |
+
"grad_norm": 1.535729509926539,
|
| 1636 |
+
"learning_rate": 1.2604475615025092e-07,
|
| 1637 |
+
"loss": 0.46,
|
| 1638 |
+
"step": 2280
|
| 1639 |
+
},
|
| 1640 |
+
{
|
| 1641 |
+
"epoch": 2.820702402957486,
|
| 1642 |
+
"grad_norm": 1.4756674292181435,
|
| 1643 |
+
"learning_rate": 1.1055701881719838e-07,
|
| 1644 |
+
"loss": 0.4722,
|
| 1645 |
+
"step": 2290
|
| 1646 |
+
},
|
| 1647 |
+
{
|
| 1648 |
+
"epoch": 2.833025261860752,
|
| 1649 |
+
"grad_norm": 1.365953899712207,
|
| 1650 |
+
"learning_rate": 9.607359798384785e-08,
|
| 1651 |
+
"loss": 0.4677,
|
| 1652 |
+
"step": 2300
|
| 1653 |
+
},
|
| 1654 |
+
{
|
| 1655 |
+
"epoch": 2.845348120764017,
|
| 1656 |
+
"grad_norm": 1.2777787545094819,
|
| 1657 |
+
"learning_rate": 8.259746861863094e-08,
|
| 1658 |
+
"loss": 0.4645,
|
| 1659 |
+
"step": 2310
|
| 1660 |
+
},
|
| 1661 |
+
{
|
| 1662 |
+
"epoch": 2.857670979667283,
|
| 1663 |
+
"grad_norm": 1.4119719950612792,
|
| 1664 |
+
"learning_rate": 7.013139878717934e-08,
|
| 1665 |
+
"loss": 0.4659,
|
| 1666 |
+
"step": 2320
|
| 1667 |
+
},
|
| 1668 |
+
{
|
| 1669 |
+
"epoch": 2.869993838570548,
|
| 1670 |
+
"grad_norm": 1.3219403705325738,
|
| 1671 |
+
"learning_rate": 5.8677949083749686e-08,
|
| 1672 |
+
"loss": 0.4717,
|
| 1673 |
+
"step": 2330
|
| 1674 |
+
},
|
| 1675 |
+
{
|
| 1676 |
+
"epoch": 2.882316697473814,
|
| 1677 |
+
"grad_norm": 1.43452858824685,
|
| 1678 |
+
"learning_rate": 4.823947210526647e-08,
|
| 1679 |
+
"loss": 0.4562,
|
| 1680 |
+
"step": 2340
|
| 1681 |
+
},
|
| 1682 |
+
{
|
| 1683 |
+
"epoch": 2.8946395563770793,
|
| 1684 |
+
"grad_norm": 1.4018883778180602,
|
| 1685 |
+
"learning_rate": 3.8818111968083607e-08,
|
| 1686 |
+
"loss": 0.4784,
|
| 1687 |
+
"step": 2350
|
| 1688 |
+
},
|
| 1689 |
+
{
|
| 1690 |
+
"epoch": 2.9069624152803453,
|
| 1691 |
+
"grad_norm": 1.4362249713543827,
|
| 1692 |
+
"learning_rate": 3.041580386757448e-08,
|
| 1693 |
+
"loss": 0.4736,
|
| 1694 |
+
"step": 2360
|
| 1695 |
+
},
|
| 1696 |
+
{
|
| 1697 |
+
"epoch": 2.9192852741836104,
|
| 1698 |
+
"grad_norm": 1.4667038100350953,
|
| 1699 |
+
"learning_rate": 2.3034273680632157e-08,
|
| 1700 |
+
"loss": 0.4658,
|
| 1701 |
+
"step": 2370
|
| 1702 |
+
},
|
| 1703 |
+
{
|
| 1704 |
+
"epoch": 2.9316081330868764,
|
| 1705 |
+
"grad_norm": 1.4777477329312418,
|
| 1706 |
+
"learning_rate": 1.6675037611165735e-08,
|
| 1707 |
+
"loss": 0.4962,
|
| 1708 |
+
"step": 2380
|
| 1709 |
+
},
|
| 1710 |
+
{
|
| 1711 |
+
"epoch": 2.9439309919901415,
|
| 1712 |
+
"grad_norm": 1.4703683347893792,
|
| 1713 |
+
"learning_rate": 1.1339401878663337e-08,
|
| 1714 |
+
"loss": 0.4672,
|
| 1715 |
+
"step": 2390
|
| 1716 |
+
},
|
| 1717 |
+
{
|
| 1718 |
+
"epoch": 2.9562538508934075,
|
| 1719 |
+
"grad_norm": 1.4130266231255557,
|
| 1720 |
+
"learning_rate": 7.028462449889528e-09,
|
| 1721 |
+
"loss": 0.4633,
|
| 1722 |
+
"step": 2400
|
| 1723 |
+
},
|
| 1724 |
+
{
|
| 1725 |
+
"epoch": 2.9685767097966727,
|
| 1726 |
+
"grad_norm": 1.3914037328445572,
|
| 1727 |
+
"learning_rate": 3.743104813767051e-09,
|
| 1728 |
+
"loss": 0.4541,
|
| 1729 |
+
"step": 2410
|
| 1730 |
+
},
|
| 1731 |
+
{
|
| 1732 |
+
"epoch": 2.9808995686999387,
|
| 1733 |
+
"grad_norm": 1.4673797756485851,
|
| 1734 |
+
"learning_rate": 1.4840037994923173e-09,
|
| 1735 |
+
"loss": 0.4567,
|
| 1736 |
+
"step": 2420
|
| 1737 |
+
},
|
| 1738 |
+
{
|
| 1739 |
+
"epoch": 2.993222427603204,
|
| 1740 |
+
"grad_norm": 1.336412676673849,
|
| 1741 |
+
"learning_rate": 2.516234379235094e-10,
|
| 1742 |
+
"loss": 0.4601,
|
| 1743 |
+
"step": 2430
|
| 1744 |
+
},
|
| 1745 |
+
{
|
| 1746 |
+
"epoch": 3.0,
|
| 1747 |
+
"step": 2436,
|
| 1748 |
+
"total_flos": 598743726292992.0,
|
| 1749 |
+
"train_loss": 0.6124914906099317,
|
| 1750 |
+
"train_runtime": 9399.7215,
|
| 1751 |
+
"train_samples_per_second": 16.571,
|
| 1752 |
+
"train_steps_per_second": 0.259
|
| 1753 |
+
}
|
| 1754 |
+
],
|
| 1755 |
+
"logging_steps": 10,
|
| 1756 |
+
"max_steps": 2436,
|
| 1757 |
+
"num_input_tokens_seen": 0,
|
| 1758 |
+
"num_train_epochs": 3,
|
| 1759 |
+
"save_steps": 1000,
|
| 1760 |
+
"stateful_callbacks": {
|
| 1761 |
+
"TrainerControl": {
|
| 1762 |
+
"args": {
|
| 1763 |
+
"should_epoch_stop": false,
|
| 1764 |
+
"should_evaluate": false,
|
| 1765 |
+
"should_log": false,
|
| 1766 |
+
"should_save": true,
|
| 1767 |
+
"should_training_stop": true
|
| 1768 |
+
},
|
| 1769 |
+
"attributes": {}
|
| 1770 |
+
}
|
| 1771 |
+
},
|
| 1772 |
+
"total_flos": 598743726292992.0,
|
| 1773 |
+
"train_batch_size": 4,
|
| 1774 |
+
"trial_name": null,
|
| 1775 |
+
"trial_params": null
|
| 1776 |
+
}
|
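The two PNGs below ship pre-rendered loss curves, but the same curves can be rebuilt from this state file. A minimal sketch, assuming the checkpoint folder has been downloaded locally and matplotlib is available (field names as in the log entries above; the output filename is hypothetical):

```python
import json
import matplotlib.pyplot as plt

# Training records carry "loss"; the periodic evaluation records carry "eval_loss".
with open("EXP_1.1_3b/trainer_state.json") as f:
    history = json.load(f)["log_history"]

train = [(e["step"], e["loss"]) for e in history if "loss" in e]
evals = [(e["step"], e["eval_loss"]) for e in history if "eval_loss" in e]

plt.plot(*zip(*train), label="train loss")
plt.plot(*zip(*evals), "o--", label="eval loss")
plt.xlabel("step")
plt.ylabel("loss")
plt.legend()
plt.savefig("training_loss_replot.png")
```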
EXP_1.1_3b/training_args.bin ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:e00930cc5b9b912ef62d9bf9ecb60388b02022d6fc087abd0690f66ed466b9a7
+size 7953
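training_args.bin is not model weights: the HF Trainer pickles its full TrainingArguments next to every run, which is why the LFS pointer above records only 7,953 bytes. A minimal sketch for inspecting it, assuming a compatible transformers version is installed; weights_only=False is needed on recent PyTorch because the file is a pickle rather than a tensor checkpoint:

```python
import torch

# Unpickles a transformers.TrainingArguments object (not a tensor checkpoint).
args = torch.load("EXP_1.1_3b/training_args.bin", weights_only=False)
print(args.learning_rate, args.per_device_train_batch_size, args.lr_scheduler_type)
```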
EXP_1.1_3b/training_eval_loss.png ADDED
EXP_1.1_3b/training_loss.png ADDED
EXP_1.1_3b/video_preprocessor_config.json ADDED
@@ -0,0 +1,86 @@
+{
+  "_valid_kwargs_names": [
+    "do_convert_rgb",
+    "do_resize",
+    "size",
+    "size_divisor",
+    "default_to_square",
+    "resample",
+    "do_rescale",
+    "rescale_factor",
+    "do_normalize",
+    "image_mean",
+    "image_std",
+    "do_pad",
+    "do_center_crop",
+    "crop_size",
+    "data_format",
+    "input_data_format",
+    "device",
+    "min_pixels",
+    "max_pixels",
+    "patch_size",
+    "temporal_patch_size",
+    "merge_size"
+  ],
+  "crop_size": null,
+  "data_format": "channels_first",
+  "default_to_square": true,
+  "device": null,
+  "do_center_crop": null,
+  "do_convert_rgb": true,
+  "do_normalize": true,
+  "do_pad": null,
+  "do_rescale": true,
+  "do_resize": true,
+  "image_mean": [
+    0.48145466,
+    0.4578275,
+    0.40821073
+  ],
+  "image_processor_type": "Qwen2VLImageProcessor",
+  "image_std": [
+    0.26862954,
+    0.26130258,
+    0.27577711
+  ],
+  "input_data_format": null,
+  "max_pixels": 12845056,
+  "merge_size": 2,
+  "min_pixels": 3136,
+  "model_valid_processing_keys": [
+    "do_convert_rgb",
+    "do_resize",
+    "size",
+    "size_divisor",
+    "default_to_square",
+    "resample",
+    "do_rescale",
+    "rescale_factor",
+    "do_normalize",
+    "image_mean",
+    "image_std",
+    "do_pad",
+    "do_center_crop",
+    "crop_size",
+    "data_format",
+    "input_data_format",
+    "device",
+    "min_pixels",
+    "max_pixels",
+    "patch_size",
+    "temporal_patch_size",
+    "merge_size"
+  ],
+  "patch_size": 14,
+  "processor_class": "Qwen2_5_VLProcessor",
+  "resample": 3,
+  "rescale_factor": 0.00392156862745098,
+  "size": {
+    "longest_edge": 12845056,
+    "shortest_edge": 3136
+  },
+  "size_divisor": null,
+  "temporal_patch_size": 2,
+  "video_processor_type": "Qwen2VLVideoProcessor"
+}
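min_pixels and max_pixels bound the per-image resolution budget: with patch_size 14 and merge_size 2, one merged vision token covers 28 × 28 = 784 pixels, so 3136 and 12845056 pixels correspond to 4 and 16384 vision tokens respectively. A quick sketch of that arithmetic plus loading the processor, assuming the folder has been pulled locally so AutoProcessor can resolve it:

```python
from transformers import AutoProcessor

processor = AutoProcessor.from_pretrained("EXP_1.1_3b")  # resolves Qwen2_5_VLProcessor

PATCH, MERGE = 14, 2
pixels_per_token = (PATCH * MERGE) ** 2  # 784 pixels per merged patch
print(3136 // pixels_per_token)          # -> 4 vision tokens minimum
print(12845056 // pixels_per_token)      # -> 16384 vision tokens maximum
```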
EXP_1.1_3b/vocab.json ADDED
The diff for this file is too large to render. See raw diff.
EXP_1.2_3b/README.md
ADDED
|
@@ -0,0 +1,63 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
---
library_name: transformers
license: other
base_model: /mnt/nvme/hyz/hf/models/Qwen2.5-VL-3B-Instruct
tags:
- llama-factory
- full
- generated_from_trainer
model-index:
- name: EXP_1.2_3b
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# EXP_1.2_3b

This model is a fine-tuned version of [/mnt/nvme/hyz/hf/models/Qwen2.5-VL-3B-Instruct](https://huggingface.co//mnt/nvme/hyz/hf/models/Qwen2.5-VL-3B-Instruct) on the multimodal-open-r1-8k-verified_train_long_cot dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4758

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- total_eval_batch_size: 8
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3.0

### Training results


### Framework versions

- Transformers 4.52.4
- Pytorch 2.7.1+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1
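The card stops at the framework versions, so here is a minimal loading sketch, assuming the `EXP_1.2_3b` folder from this commit is available locally; the class names come from the Transformers 4.52 release pinned above:

```python
import torch
from transformers import AutoProcessor, Qwen2_5_VLForConditionalGeneration

# Load the fine-tuned checkpoint from the local folder uploaded in this commit.
model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
    "EXP_1.2_3b",                # assumed local path to this folder
    torch_dtype=torch.bfloat16,  # matches "torch_dtype": "bfloat16" in config.json
    device_map="auto",
)
processor = AutoProcessor.from_pretrained("EXP_1.2_3b")
```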
EXP_1.2_3b/added_tokens.json
ADDED
@@ -0,0 +1,24 @@
{
  "</tool_call>": 151658,
  "<tool_call>": 151657,
  "<|box_end|>": 151649,
  "<|box_start|>": 151648,
  "<|endoftext|>": 151643,
  "<|file_sep|>": 151664,
  "<|fim_middle|>": 151660,
  "<|fim_pad|>": 151662,
  "<|fim_prefix|>": 151659,
  "<|fim_suffix|>": 151661,
  "<|im_end|>": 151645,
  "<|im_start|>": 151644,
  "<|image_pad|>": 151655,
  "<|object_ref_end|>": 151647,
  "<|object_ref_start|>": 151646,
  "<|quad_end|>": 151651,
  "<|quad_start|>": 151650,
  "<|repo_name|>": 151663,
  "<|video_pad|>": 151656,
  "<|vision_end|>": 151653,
  "<|vision_pad|>": 151654,
  "<|vision_start|>": 151652
}
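A quick sanity-check sketch, assuming the same local `EXP_1.2_3b` folder: once the tokenizer is loaded, the added tokens above should resolve to exactly the ids recorded in this file:

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("EXP_1.2_3b")  # assumed local path
# The vision pad token and the chat end-of-turn token from added_tokens.json:
assert tok.convert_tokens_to_ids("<|image_pad|>") == 151655
assert tok.convert_tokens_to_ids("<|im_end|>") == 151645
```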
EXP_1.2_3b/all_results.json
ADDED
@@ -0,0 +1,12 @@
{
  "epoch": 3.0,
  "eval_loss": 0.47578269243240356,
  "eval_runtime": 39.7643,
  "eval_samples_per_second": 19.339,
  "eval_steps_per_second": 2.439,
  "total_flos": 56487017447424.0,
  "train_loss": 0.33897173769248007,
  "train_runtime": 796.6909,
  "train_samples_per_second": 26.058,
  "train_steps_per_second": 0.41
}
EXP_1.2_3b/chat_template.jinja
ADDED
@@ -0,0 +1,7 @@
{% set image_count = namespace(value=0) %}{% set video_count = namespace(value=0) %}{% for message in messages %}{% if loop.first and message['role'] != 'system' %}<|im_start|>system
You are a helpful assistant.<|im_end|>
{% endif %}<|im_start|>{{ message['role'] }}
{% if message['content'] is string %}{{ message['content'] }}<|im_end|>
{% else %}{% for content in message['content'] %}{% if content['type'] == 'image' or 'image' in content or 'image_url' in content %}{% set image_count.value = image_count.value + 1 %}{% if add_vision_id %}Picture {{ image_count.value }}: {% endif %}<|vision_start|><|image_pad|><|vision_end|>{% elif content['type'] == 'video' or 'video' in content %}{% set video_count.value = video_count.value + 1 %}{% if add_vision_id %}Video {{ video_count.value }}: {% endif %}<|vision_start|><|video_pad|><|vision_end|>{% elif 'text' in content %}{{ content['text'] }}{% endif %}{% endfor %}<|im_end|>
{% endif %}{% endfor %}{% if add_generation_prompt %}<|im_start|>assistant
{% endif %}
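This Jinja template is what `apply_chat_template` renders: messages become `<|im_start|>...<|im_end|>` turns, with `<|vision_start|><|image_pad|><|vision_end|>` substituted for each image entry. A minimal usage sketch, assuming the local checkpoint folder:

```python
from transformers import AutoProcessor

processor = AutoProcessor.from_pretrained("EXP_1.2_3b")  # assumed local path
messages = [
    {"role": "user", "content": [
        {"type": "image", "image": "demo.jpg"},  # hypothetical image file
        {"type": "text", "text": "Describe this picture."},
    ]},
]
prompt = processor.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
# The rendered prompt ends with "<|im_start|>assistant\n" because
# add_generation_prompt=True triggers the template's final branch.
print(prompt)
```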
EXP_1.2_3b/config.json
ADDED
@@ -0,0 +1,105 @@
{
  "architectures": [
    "Qwen2_5_VLForConditionalGeneration"
  ],
  "attention_dropout": 0.0,
  "bos_token_id": 151643,
  "eos_token_id": 151645,
  "hidden_act": "silu",
  "hidden_size": 2048,
  "image_token_id": 151655,
  "initializer_range": 0.02,
  "intermediate_size": 11008,
  "max_position_embeddings": 128000,
  "max_window_layers": 70,
  "model_type": "qwen2_5_vl",
  "num_attention_heads": 16,
  "num_hidden_layers": 36,
  "num_key_value_heads": 2,
  "rms_norm_eps": 1e-06,
  "rope_scaling": {
    "mrope_section": [
      16,
      24,
      24
    ],
    "rope_type": "default",
    "type": "default"
  },
  "rope_theta": 1000000.0,
  "sliding_window": 32768,
  "text_config": {
    "architectures": [
      "Qwen2_5_VLForConditionalGeneration"
    ],
    "attention_dropout": 0.0,
    "bos_token_id": 151643,
    "eos_token_id": 151645,
    "hidden_act": "silu",
    "hidden_size": 2048,
    "image_token_id": null,
    "initializer_range": 0.02,
    "intermediate_size": 11008,
    "max_position_embeddings": 128000,
    "max_window_layers": 70,
    "model_type": "qwen2_5_vl_text",
    "num_attention_heads": 16,
    "num_hidden_layers": 36,
    "num_key_value_heads": 2,
    "rms_norm_eps": 1e-06,
    "rope_scaling": {
      "mrope_section": [
        16,
        24,
        24
      ],
      "rope_type": "default",
      "type": "default"
    },
    "rope_theta": 1000000.0,
    "sliding_window": 32768,
    "tie_word_embeddings": true,
    "torch_dtype": "float32",
    "use_cache": false,
    "use_sliding_window": false,
    "video_token_id": null,
    "vision_end_token_id": 151653,
    "vision_start_token_id": 151652,
    "vision_token_id": 151654,
    "vocab_size": 151936
  },
  "torch_dtype": "bfloat16",
  "transformers_version": "4.52.4",
  "use_cache": false,
  "use_sliding_window": false,
  "video_token_id": 151656,
  "vision_config": {
    "depth": 32,
    "fullatt_block_indexes": [
      7,
      15,
      23,
      31
    ],
    "hidden_act": "silu",
    "hidden_size": 1280,
    "in_channels": 3,
    "in_chans": 3,
    "initializer_range": 0.02,
    "intermediate_size": 3420,
    "model_type": "qwen2_5_vl",
    "num_heads": 16,
    "out_hidden_size": 2048,
    "patch_size": 14,
    "spatial_merge_size": 2,
    "spatial_patch_size": 14,
    "temporal_patch_size": 2,
    "tokens_per_second": 2,
    "torch_dtype": "float32",
    "window_size": 112
  },
  "vision_end_token_id": 151653,
  "vision_start_token_id": 151652,
  "vision_token_id": 151654,
  "vocab_size": 151936
}
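The text stack above implies grouped-query attention: 16 query heads share 2 key/value heads at a head dimension of 2048 / 16 = 128. A small sketch that reads those fields back, assuming the local checkpoint folder:

```python
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("EXP_1.2_3b")  # assumed local path
text = cfg.text_config  # nested text stack, as in the JSON above
head_dim = text.hidden_size // text.num_attention_heads        # 2048 // 16 = 128
gqa_groups = text.num_attention_heads // text.num_key_value_heads  # 16 // 2 = 8
print(text.num_hidden_layers, head_dim, gqa_groups)            # 36 128 8
```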
EXP_1.2_3b/eval_results.json
ADDED
@@ -0,0 +1,7 @@
{
  "epoch": 3.0,
  "eval_loss": 0.47578269243240356,
  "eval_runtime": 39.7643,
  "eval_samples_per_second": 19.339,
  "eval_steps_per_second": 2.439
}
EXP_1.2_3b/generation_config.json
ADDED
@@ -0,0 +1,12 @@
{
  "bos_token_id": 151643,
  "do_sample": true,
  "eos_token_id": [
    151645,
    151643
  ],
  "pad_token_id": 151643,
  "repetition_penalty": 1.05,
  "temperature": 1e-06,
  "transformers_version": "4.52.4"
}
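Note that `temperature: 1e-06` with `do_sample: true` makes decoding effectively greedy apart from the 1.05 repetition penalty. A sketch of how these defaults are picked up, assuming the local checkpoint folder:

```python
from transformers import GenerationConfig

# generate() reads these defaults from generation_config.json automatically;
# loading the config explicitly just makes them inspectable.
gen_cfg = GenerationConfig.from_pretrained("EXP_1.2_3b")  # assumed local path
print(gen_cfg.temperature, gen_cfg.repetition_penalty)    # 1e-06 1.05
# outputs = model.generate(**inputs, generation_config=gen_cfg)
```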
EXP_1.2_3b/merges.txt
ADDED
The diff for this file is too large to render.
See raw diff
EXP_1.2_3b/model-00001-of-00002.safetensors
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:cca91d23c884ea32ec9c9a82275f94153d501c4fdf693c085f5802ee38054a8f
size 4997750760
EXP_1.2_3b/model-00002-of-00002.safetensors
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b79e238572fa6c8a446cc5cd858f7755e96d6326beaa069e56d6ec53d4746cb1
size 2511587184
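Both shard entries above are git-lfs pointer files: only the `oid` and `size` live in the repo, while the actual ~5.0 GB and ~2.5 GB of weights sit in LFS storage. A sketch of fetching one resolved shard with `huggingface_hub`, using a placeholder repo id for wherever this folder was uploaded:

```python
from huggingface_hub import hf_hub_download

# Downloads and caches the real shard bytes that the LFS pointer refers to.
path = hf_hub_download(
    repo_id="<user>/<repo>",  # hypothetical: the Hub repo this commit belongs to
    filename="EXP_1.2_3b/model-00001-of-00002.safetensors",
)
print(path)  # local cache path to the shard (size 4997750760 bytes)
```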
EXP_1.2_3b/model.safetensors.index.json
ADDED
@@ -0,0 +1,831 @@
{
  "metadata": {
    "total_size": 7509245952
  },
  "weight_map": {
    "model.embed_tokens.weight": "model-00001-of-00002.safetensors",
    "model.layers.0.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.0.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.0.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.0.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.0.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.0.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.1.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.1.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.1.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.1.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.1.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.1.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.10.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.10.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.10.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.10.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.10.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.10.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.10.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.10.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.10.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.10.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.11.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.11.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.11.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.11.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.11.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.11.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.11.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.11.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.11.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.11.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.11.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.11.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.12.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.12.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.12.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.12.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.12.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.12.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.12.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.12.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.12.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.12.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.12.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.12.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.13.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.13.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.13.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.13.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.13.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.13.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.13.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.13.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.13.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.13.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.13.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.13.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.14.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.14.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.14.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.14.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.14.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.14.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.14.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.14.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.14.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.14.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.14.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.14.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.15.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.15.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.15.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.15.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.15.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.15.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.15.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.15.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.15.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.15.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.15.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.15.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.16.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.16.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.16.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.16.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.16.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.16.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.16.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.16.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.16.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.16.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.16.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.16.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.17.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.17.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.17.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.17.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.17.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.17.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.17.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.17.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.17.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.17.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.17.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.17.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.18.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.18.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.18.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.18.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.18.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.18.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.18.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.18.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.18.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.18.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.18.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.18.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.19.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.19.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.19.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.19.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.19.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.19.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.19.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.19.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.19.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.19.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.19.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.2.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.2.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.2.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.2.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.2.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.2.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.20.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.20.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.20.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.20.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.20.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.20.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.20.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.21.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.21.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.21.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.21.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.21.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.21.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.22.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.22.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.22.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.22.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.22.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.22.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.22.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.22.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.23.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.23.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.23.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.23.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.23.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.23.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.23.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.23.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.23.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.23.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.23.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.23.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.24.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.24.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.24.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.24.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.24.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.24.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.24.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.24.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.24.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.24.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.24.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.24.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.25.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.25.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.25.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.25.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.25.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.25.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.25.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.25.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.25.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.25.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.25.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.25.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.26.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.26.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.26.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.26.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.26.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.26.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.26.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.26.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.26.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.26.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.26.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.26.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.27.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.27.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.27.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.27.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.27.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.27.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.27.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.27.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.27.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.27.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.27.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.27.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.28.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.28.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.28.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.28.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.28.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.28.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.28.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.28.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.28.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.28.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.28.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.28.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.29.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.29.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.29.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.29.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.29.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.29.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.29.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.29.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.29.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.29.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.29.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.29.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.3.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.3.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.3.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.3.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.3.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.3.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.30.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.30.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.30.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.30.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.30.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.30.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.30.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.30.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.30.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.30.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.30.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.30.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.31.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.31.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.31.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.31.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.31.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.31.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.31.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.31.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.31.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.31.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.31.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.31.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.32.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.32.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.32.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.32.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.32.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.32.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.32.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.32.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.32.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.32.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.32.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.32.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.33.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.33.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.33.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.33.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.33.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.33.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.33.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.33.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.33.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.33.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.33.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.33.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.34.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.34.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.34.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.34.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.34.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.34.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.34.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.34.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.34.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.34.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.34.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.34.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.35.input_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.35.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.35.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.35.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.35.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
    "model.layers.35.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.35.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.35.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.35.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.35.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.35.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
    "model.layers.35.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
    "model.layers.4.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.4.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.4.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.4.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.4.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.4.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.4.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.5.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.5.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.5.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.5.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.5.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.5.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.5.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.5.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.6.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.6.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.6.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.6.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.6.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.6.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.6.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.7.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.7.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.7.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.7.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.7.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.7.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.7.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.7.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.7.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.8.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.8.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.8.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.8.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.8.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.8.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.8.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.8.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.8.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.8.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.8.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.9.input_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.9.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.9.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.9.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.9.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
    "model.layers.9.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.9.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "model.layers.9.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
    "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
    "model.norm.weight": "model-00002-of-00002.safetensors",
    "visual.blocks.0.attn.proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.0.attn.proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.0.attn.qkv.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.0.attn.qkv.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.0.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.0.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.0.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.0.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.0.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.0.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.0.norm1.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.0.norm2.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.1.attn.proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.1.attn.proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.1.attn.qkv.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.1.attn.qkv.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.1.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.1.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.1.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.1.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.1.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.1.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.1.norm1.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.1.norm2.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.10.attn.proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.10.attn.proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.10.attn.qkv.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.10.attn.qkv.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.10.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.10.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.10.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.10.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.10.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.10.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.10.norm1.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.10.norm2.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.11.attn.proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.11.attn.proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.11.attn.qkv.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.11.attn.qkv.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.11.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.11.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.11.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.11.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.11.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.11.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.11.norm1.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.11.norm2.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.12.attn.proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.12.attn.proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.12.attn.qkv.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.12.attn.qkv.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.12.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.12.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.12.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.12.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.12.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.12.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.12.norm1.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.12.norm2.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.13.attn.proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.13.attn.proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.13.attn.qkv.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.13.attn.qkv.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.13.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.13.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.13.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.13.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.13.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.13.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.13.norm1.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.13.norm2.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.14.attn.proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.14.attn.proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.14.attn.qkv.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.14.attn.qkv.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.14.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.14.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.14.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.14.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.14.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.14.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.14.norm1.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.14.norm2.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.15.attn.proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.15.attn.proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.15.attn.qkv.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.15.attn.qkv.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.15.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.15.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.15.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.15.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
    "visual.blocks.15.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
    "visual.blocks.15.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 534 |
+
"visual.blocks.15.norm1.weight": "model-00001-of-00002.safetensors",
|
| 535 |
+
"visual.blocks.15.norm2.weight": "model-00001-of-00002.safetensors",
|
| 536 |
+
"visual.blocks.16.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 537 |
+
"visual.blocks.16.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 538 |
+
"visual.blocks.16.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 539 |
+
"visual.blocks.16.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 540 |
+
"visual.blocks.16.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 541 |
+
"visual.blocks.16.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 542 |
+
"visual.blocks.16.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 543 |
+
"visual.blocks.16.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 544 |
+
"visual.blocks.16.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 545 |
+
"visual.blocks.16.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 546 |
+
"visual.blocks.16.norm1.weight": "model-00001-of-00002.safetensors",
|
| 547 |
+
"visual.blocks.16.norm2.weight": "model-00001-of-00002.safetensors",
|
| 548 |
+
"visual.blocks.17.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 549 |
+
"visual.blocks.17.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 550 |
+
"visual.blocks.17.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 551 |
+
"visual.blocks.17.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 552 |
+
"visual.blocks.17.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 553 |
+
"visual.blocks.17.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 554 |
+
"visual.blocks.17.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 555 |
+
"visual.blocks.17.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 556 |
+
"visual.blocks.17.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 557 |
+
"visual.blocks.17.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 558 |
+
"visual.blocks.17.norm1.weight": "model-00001-of-00002.safetensors",
|
| 559 |
+
"visual.blocks.17.norm2.weight": "model-00001-of-00002.safetensors",
|
| 560 |
+
"visual.blocks.18.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 561 |
+
"visual.blocks.18.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 562 |
+
"visual.blocks.18.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 563 |
+
"visual.blocks.18.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 564 |
+
"visual.blocks.18.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 565 |
+
"visual.blocks.18.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 566 |
+
"visual.blocks.18.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 567 |
+
"visual.blocks.18.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 568 |
+
"visual.blocks.18.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 569 |
+
"visual.blocks.18.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 570 |
+
"visual.blocks.18.norm1.weight": "model-00001-of-00002.safetensors",
|
| 571 |
+
"visual.blocks.18.norm2.weight": "model-00001-of-00002.safetensors",
|
| 572 |
+
"visual.blocks.19.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 573 |
+
"visual.blocks.19.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 574 |
+
"visual.blocks.19.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 575 |
+
"visual.blocks.19.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 576 |
+
"visual.blocks.19.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 577 |
+
"visual.blocks.19.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 578 |
+
"visual.blocks.19.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 579 |
+
"visual.blocks.19.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 580 |
+
"visual.blocks.19.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 581 |
+
"visual.blocks.19.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 582 |
+
"visual.blocks.19.norm1.weight": "model-00001-of-00002.safetensors",
|
| 583 |
+
"visual.blocks.19.norm2.weight": "model-00001-of-00002.safetensors",
|
| 584 |
+
"visual.blocks.2.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 585 |
+
"visual.blocks.2.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 586 |
+
"visual.blocks.2.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 587 |
+
"visual.blocks.2.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 588 |
+
"visual.blocks.2.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 589 |
+
"visual.blocks.2.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 590 |
+
"visual.blocks.2.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 591 |
+
"visual.blocks.2.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 592 |
+
"visual.blocks.2.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 593 |
+
"visual.blocks.2.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 594 |
+
"visual.blocks.2.norm1.weight": "model-00001-of-00002.safetensors",
|
| 595 |
+
"visual.blocks.2.norm2.weight": "model-00001-of-00002.safetensors",
|
| 596 |
+
"visual.blocks.20.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 597 |
+
"visual.blocks.20.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 598 |
+
"visual.blocks.20.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 599 |
+
"visual.blocks.20.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 600 |
+
"visual.blocks.20.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 601 |
+
"visual.blocks.20.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 602 |
+
"visual.blocks.20.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 603 |
+
"visual.blocks.20.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 604 |
+
"visual.blocks.20.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 605 |
+
"visual.blocks.20.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 606 |
+
"visual.blocks.20.norm1.weight": "model-00001-of-00002.safetensors",
|
| 607 |
+
"visual.blocks.20.norm2.weight": "model-00001-of-00002.safetensors",
|
| 608 |
+
"visual.blocks.21.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 609 |
+
"visual.blocks.21.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 610 |
+
"visual.blocks.21.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 611 |
+
"visual.blocks.21.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 612 |
+
"visual.blocks.21.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 613 |
+
"visual.blocks.21.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 614 |
+
"visual.blocks.21.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 615 |
+
"visual.blocks.21.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 616 |
+
"visual.blocks.21.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 617 |
+
"visual.blocks.21.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 618 |
+
"visual.blocks.21.norm1.weight": "model-00001-of-00002.safetensors",
|
| 619 |
+
"visual.blocks.21.norm2.weight": "model-00001-of-00002.safetensors",
|
| 620 |
+
"visual.blocks.22.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 621 |
+
"visual.blocks.22.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 622 |
+
"visual.blocks.22.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 623 |
+
"visual.blocks.22.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 624 |
+
"visual.blocks.22.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 625 |
+
"visual.blocks.22.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 626 |
+
"visual.blocks.22.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 627 |
+
"visual.blocks.22.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 628 |
+
"visual.blocks.22.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 629 |
+
"visual.blocks.22.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 630 |
+
"visual.blocks.22.norm1.weight": "model-00001-of-00002.safetensors",
|
| 631 |
+
"visual.blocks.22.norm2.weight": "model-00001-of-00002.safetensors",
|
| 632 |
+
"visual.blocks.23.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 633 |
+
"visual.blocks.23.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 634 |
+
"visual.blocks.23.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 635 |
+
"visual.blocks.23.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 636 |
+
"visual.blocks.23.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 637 |
+
"visual.blocks.23.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 638 |
+
"visual.blocks.23.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 639 |
+
"visual.blocks.23.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 640 |
+
"visual.blocks.23.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 641 |
+
"visual.blocks.23.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 642 |
+
"visual.blocks.23.norm1.weight": "model-00001-of-00002.safetensors",
|
| 643 |
+
"visual.blocks.23.norm2.weight": "model-00001-of-00002.safetensors",
|
| 644 |
+
"visual.blocks.24.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 645 |
+
"visual.blocks.24.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 646 |
+
"visual.blocks.24.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 647 |
+
"visual.blocks.24.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 648 |
+
"visual.blocks.24.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 649 |
+
"visual.blocks.24.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 650 |
+
"visual.blocks.24.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 651 |
+
"visual.blocks.24.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 652 |
+
"visual.blocks.24.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 653 |
+
"visual.blocks.24.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 654 |
+
"visual.blocks.24.norm1.weight": "model-00001-of-00002.safetensors",
|
| 655 |
+
"visual.blocks.24.norm2.weight": "model-00001-of-00002.safetensors",
|
| 656 |
+
"visual.blocks.25.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 657 |
+
"visual.blocks.25.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 658 |
+
"visual.blocks.25.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 659 |
+
"visual.blocks.25.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 660 |
+
"visual.blocks.25.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 661 |
+
"visual.blocks.25.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 662 |
+
"visual.blocks.25.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 663 |
+
"visual.blocks.25.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 664 |
+
"visual.blocks.25.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 665 |
+
"visual.blocks.25.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 666 |
+
"visual.blocks.25.norm1.weight": "model-00001-of-00002.safetensors",
|
| 667 |
+
"visual.blocks.25.norm2.weight": "model-00001-of-00002.safetensors",
|
| 668 |
+
"visual.blocks.26.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 669 |
+
"visual.blocks.26.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 670 |
+
"visual.blocks.26.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 671 |
+
"visual.blocks.26.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 672 |
+
"visual.blocks.26.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 673 |
+
"visual.blocks.26.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 674 |
+
"visual.blocks.26.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 675 |
+
"visual.blocks.26.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 676 |
+
"visual.blocks.26.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 677 |
+
"visual.blocks.26.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 678 |
+
"visual.blocks.26.norm1.weight": "model-00001-of-00002.safetensors",
|
| 679 |
+
"visual.blocks.26.norm2.weight": "model-00001-of-00002.safetensors",
|
| 680 |
+
"visual.blocks.27.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 681 |
+
"visual.blocks.27.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 682 |
+
"visual.blocks.27.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 683 |
+
"visual.blocks.27.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 684 |
+
"visual.blocks.27.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 685 |
+
"visual.blocks.27.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 686 |
+
"visual.blocks.27.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 687 |
+
"visual.blocks.27.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 688 |
+
"visual.blocks.27.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 689 |
+
"visual.blocks.27.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 690 |
+
"visual.blocks.27.norm1.weight": "model-00001-of-00002.safetensors",
|
| 691 |
+
"visual.blocks.27.norm2.weight": "model-00001-of-00002.safetensors",
|
| 692 |
+
"visual.blocks.28.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 693 |
+
"visual.blocks.28.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 694 |
+
"visual.blocks.28.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 695 |
+
"visual.blocks.28.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 696 |
+
"visual.blocks.28.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 697 |
+
"visual.blocks.28.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 698 |
+
"visual.blocks.28.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 699 |
+
"visual.blocks.28.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 700 |
+
"visual.blocks.28.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 701 |
+
"visual.blocks.28.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 702 |
+
"visual.blocks.28.norm1.weight": "model-00001-of-00002.safetensors",
|
| 703 |
+
"visual.blocks.28.norm2.weight": "model-00001-of-00002.safetensors",
|
| 704 |
+
"visual.blocks.29.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 705 |
+
"visual.blocks.29.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 706 |
+
"visual.blocks.29.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 707 |
+
"visual.blocks.29.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 708 |
+
"visual.blocks.29.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 709 |
+
"visual.blocks.29.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 710 |
+
"visual.blocks.29.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 711 |
+
"visual.blocks.29.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 712 |
+
"visual.blocks.29.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 713 |
+
"visual.blocks.29.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 714 |
+
"visual.blocks.29.norm1.weight": "model-00001-of-00002.safetensors",
|
| 715 |
+
"visual.blocks.29.norm2.weight": "model-00001-of-00002.safetensors",
|
| 716 |
+
"visual.blocks.3.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 717 |
+
"visual.blocks.3.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 718 |
+
"visual.blocks.3.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 719 |
+
"visual.blocks.3.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 720 |
+
"visual.blocks.3.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 721 |
+
"visual.blocks.3.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 722 |
+
"visual.blocks.3.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 723 |
+
"visual.blocks.3.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 724 |
+
"visual.blocks.3.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 725 |
+
"visual.blocks.3.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 726 |
+
"visual.blocks.3.norm1.weight": "model-00001-of-00002.safetensors",
|
| 727 |
+
"visual.blocks.3.norm2.weight": "model-00001-of-00002.safetensors",
|
| 728 |
+
"visual.blocks.30.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 729 |
+
"visual.blocks.30.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 730 |
+
"visual.blocks.30.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 731 |
+
"visual.blocks.30.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 732 |
+
"visual.blocks.30.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 733 |
+
"visual.blocks.30.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 734 |
+
"visual.blocks.30.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 735 |
+
"visual.blocks.30.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 736 |
+
"visual.blocks.30.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 737 |
+
"visual.blocks.30.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 738 |
+
"visual.blocks.30.norm1.weight": "model-00001-of-00002.safetensors",
|
| 739 |
+
"visual.blocks.30.norm2.weight": "model-00001-of-00002.safetensors",
|
| 740 |
+
"visual.blocks.31.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 741 |
+
"visual.blocks.31.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 742 |
+
"visual.blocks.31.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 743 |
+
"visual.blocks.31.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 744 |
+
"visual.blocks.31.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 745 |
+
"visual.blocks.31.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 746 |
+
"visual.blocks.31.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 747 |
+
"visual.blocks.31.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 748 |
+
"visual.blocks.31.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 749 |
+
"visual.blocks.31.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 750 |
+
"visual.blocks.31.norm1.weight": "model-00001-of-00002.safetensors",
|
| 751 |
+
"visual.blocks.31.norm2.weight": "model-00001-of-00002.safetensors",
|
| 752 |
+
"visual.blocks.4.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 753 |
+
"visual.blocks.4.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 754 |
+
"visual.blocks.4.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 755 |
+
"visual.blocks.4.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 756 |
+
"visual.blocks.4.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 757 |
+
"visual.blocks.4.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 758 |
+
"visual.blocks.4.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 759 |
+
"visual.blocks.4.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 760 |
+
"visual.blocks.4.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 761 |
+
"visual.blocks.4.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 762 |
+
"visual.blocks.4.norm1.weight": "model-00001-of-00002.safetensors",
|
| 763 |
+
"visual.blocks.4.norm2.weight": "model-00001-of-00002.safetensors",
|
| 764 |
+
"visual.blocks.5.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 765 |
+
"visual.blocks.5.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 766 |
+
"visual.blocks.5.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 767 |
+
"visual.blocks.5.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 768 |
+
"visual.blocks.5.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 769 |
+
"visual.blocks.5.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 770 |
+
"visual.blocks.5.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 771 |
+
"visual.blocks.5.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 772 |
+
"visual.blocks.5.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 773 |
+
"visual.blocks.5.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 774 |
+
"visual.blocks.5.norm1.weight": "model-00001-of-00002.safetensors",
|
| 775 |
+
"visual.blocks.5.norm2.weight": "model-00001-of-00002.safetensors",
|
| 776 |
+
"visual.blocks.6.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 777 |
+
"visual.blocks.6.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 778 |
+
"visual.blocks.6.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 779 |
+
"visual.blocks.6.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 780 |
+
"visual.blocks.6.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 781 |
+
"visual.blocks.6.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 782 |
+
"visual.blocks.6.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 783 |
+
"visual.blocks.6.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 784 |
+
"visual.blocks.6.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 785 |
+
"visual.blocks.6.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 786 |
+
"visual.blocks.6.norm1.weight": "model-00001-of-00002.safetensors",
|
| 787 |
+
"visual.blocks.6.norm2.weight": "model-00001-of-00002.safetensors",
|
| 788 |
+
"visual.blocks.7.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 789 |
+
"visual.blocks.7.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 790 |
+
"visual.blocks.7.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 791 |
+
"visual.blocks.7.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 792 |
+
"visual.blocks.7.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 793 |
+
"visual.blocks.7.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 794 |
+
"visual.blocks.7.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 795 |
+
"visual.blocks.7.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 796 |
+
"visual.blocks.7.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 797 |
+
"visual.blocks.7.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 798 |
+
"visual.blocks.7.norm1.weight": "model-00001-of-00002.safetensors",
|
| 799 |
+
"visual.blocks.7.norm2.weight": "model-00001-of-00002.safetensors",
|
| 800 |
+
"visual.blocks.8.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 801 |
+
"visual.blocks.8.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 802 |
+
"visual.blocks.8.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 803 |
+
"visual.blocks.8.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 804 |
+
"visual.blocks.8.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 805 |
+
"visual.blocks.8.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 806 |
+
"visual.blocks.8.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 807 |
+
"visual.blocks.8.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 808 |
+
"visual.blocks.8.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 809 |
+
"visual.blocks.8.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 810 |
+
"visual.blocks.8.norm1.weight": "model-00001-of-00002.safetensors",
|
| 811 |
+
"visual.blocks.8.norm2.weight": "model-00001-of-00002.safetensors",
|
| 812 |
+
"visual.blocks.9.attn.proj.bias": "model-00001-of-00002.safetensors",
|
| 813 |
+
"visual.blocks.9.attn.proj.weight": "model-00001-of-00002.safetensors",
|
| 814 |
+
"visual.blocks.9.attn.qkv.bias": "model-00001-of-00002.safetensors",
|
| 815 |
+
"visual.blocks.9.attn.qkv.weight": "model-00001-of-00002.safetensors",
|
| 816 |
+
"visual.blocks.9.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
|
| 817 |
+
"visual.blocks.9.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
|
| 818 |
+
"visual.blocks.9.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
|
| 819 |
+
"visual.blocks.9.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
|
| 820 |
+
"visual.blocks.9.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
|
| 821 |
+
"visual.blocks.9.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
|
| 822 |
+
"visual.blocks.9.norm1.weight": "model-00001-of-00002.safetensors",
|
| 823 |
+
"visual.blocks.9.norm2.weight": "model-00001-of-00002.safetensors",
|
| 824 |
+
"visual.merger.ln_q.weight": "model-00001-of-00002.safetensors",
|
| 825 |
+
"visual.merger.mlp.0.bias": "model-00001-of-00002.safetensors",
|
| 826 |
+
"visual.merger.mlp.0.weight": "model-00001-of-00002.safetensors",
|
| 827 |
+
"visual.merger.mlp.2.bias": "model-00001-of-00002.safetensors",
|
| 828 |
+
"visual.merger.mlp.2.weight": "model-00001-of-00002.safetensors",
|
| 829 |
+
"visual.patch_embed.proj.weight": "model-00001-of-00002.safetensors"
|
| 830 |
+
}
|
| 831 |
+
}
|
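The weight_map above is the standard safetensors sharding index: every parameter name is keyed to the shard file that stores it, and here the whole vision tower lands in shard 1 of 2. A minimal sketch of how a loader resolves one tensor from this index, assuming a local checkout of the folder (paths are illustrative):

```python
import json
from safetensors import safe_open  # pip install safetensors

ckpt_dir = "EXP_1.2_3b"  # hypothetical local checkout of this repo folder

with open(f"{ckpt_dir}/model.safetensors.index.json") as f:
    index = json.load(f)

name = "visual.blocks.0.norm1.weight"
shard = index["weight_map"][name]  # -> "model-00001-of-00002.safetensors"

# Open only the shard that holds this tensor; the other shard is never touched.
with safe_open(f"{ckpt_dir}/{shard}", framework="pt") as f:
    tensor = f.get_tensor(name)
print(shard, tuple(tensor.shape))
```
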
EXP_1.2_3b/preprocessor_config.json
ADDED
@@ -0,0 +1,29 @@
+{
+  "do_convert_rgb": true,
+  "do_normalize": true,
+  "do_rescale": true,
+  "do_resize": true,
+  "image_mean": [
+    0.48145466,
+    0.4578275,
+    0.40821073
+  ],
+  "image_processor_type": "Qwen2VLImageProcessor",
+  "image_std": [
+    0.26862954,
+    0.26130258,
+    0.27577711
+  ],
+  "max_pixels": 12845056,
+  "merge_size": 2,
+  "min_pixels": 3136,
+  "patch_size": 14,
+  "processor_class": "Qwen2_5_VLProcessor",
+  "resample": 3,
+  "rescale_factor": 0.00392156862745098,
+  "size": {
+    "longest_edge": 12845056,
+    "shortest_edge": 3136
+  },
+  "temporal_patch_size": 2
+}

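min_pixels and max_pixels bound the dynamic-resolution resizing of this image processor: each image is rescaled so its area falls inside [3136, 12845056] pixels while both sides stay multiples of patch_size * merge_size = 28. A rough sketch of that arithmetic, assuming simple square-root scaling (the library's exact rounding may differ slightly):

```python
import math

patch_size, merge_size = 14, 2
factor = patch_size * merge_size          # 28: both sides must divide evenly
min_pixels, max_pixels = 3136, 12845056

def target_size(h: int, w: int) -> tuple[int, int]:
    """Scale (h, w) into the pixel budget, rounded to multiples of 28."""
    scale = 1.0
    if h * w > max_pixels:
        scale = math.sqrt(max_pixels / (h * w))
    elif h * w < min_pixels:
        scale = math.sqrt(min_pixels / (h * w))
    snap = lambda x: max(factor, round(x * scale / factor) * factor)
    return snap(h), snap(w)

print(target_size(5000, 3000))  # shrunk to stay under the 12_845_056 budget
print(target_size(40, 40))      # grown to (56, 56), the 3_136-pixel floor
```
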
EXP_1.2_3b/special_tokens_map.json
ADDED
@@ -0,0 +1,31 @@
+{
+  "additional_special_tokens": [
+    "<|im_start|>",
+    "<|im_end|>",
+    "<|object_ref_start|>",
+    "<|object_ref_end|>",
+    "<|box_start|>",
+    "<|box_end|>",
+    "<|quad_start|>",
+    "<|quad_end|>",
+    "<|vision_start|>",
+    "<|vision_end|>",
+    "<|vision_pad|>",
+    "<|image_pad|>",
+    "<|video_pad|>"
+  ],
+  "eos_token": {
+    "content": "<|im_end|>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  },
+  "pad_token": {
+    "content": "<|endoftext|>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  }
+}

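This map is what tells Transformers which literal strings act as the end-of-sequence and padding tokens. A quick sanity check after loading the tokenizer from this folder (local path is illustrative; the expected ids come from the added_tokens_decoder table further down):

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("EXP_1.2_3b")  # hypothetical local path

assert tok.eos_token == "<|im_end|>"
assert tok.pad_token == "<|endoftext|>"
print(tok.eos_token_id, tok.pad_token_id)  # expected: 151645 151643
```
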
EXP_1.2_3b/swanlab_public_config.json
ADDED
@@ -0,0 +1,13 @@
+{
+  "project_name": "LLaMA-Factory",
+  "version": "0.6.1",
+  "run_id": "run-20250610_121219-a3b1799d",
+  "swanlog_dir": "/mnt/nvme/hyz/LLaMA-Factory/swanlog",
+  "run_dir": "/mnt/nvme/hyz/LLaMA-Factory/swanlog/run-20250610_121219-a3b1799d",
+  "cloud": {
+    "project_name": "LLaMA-Factory",
+    "project_url": "https://swanlab.cn/@huyuanze/LLaMA-Factory",
+    "experiment_name": "/mnt/nvme/hyz/mm_homework/checkpoints_full/EXP_1.2_3b",
+    "experiment_url": "https://swanlab.cn/@huyuanze/LLaMA-Factory/runs/skm5tcynsle1qxupe85cz"
+  }
+}

EXP_1.2_3b/tokenizer.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9c5ae00e602b8860cbd784ba82a8aa14e8feecec692e7076590d014d7b7fdafa
+size 11421896

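tokenizer.json is stored as a Git LFS pointer: the repository tracks only the SHA-256 oid and the byte size, and the ~11 MB payload is fetched on checkout. A small sketch that validates a downloaded file against such a pointer:

```python
import hashlib
import os

def check_lfs(path: str, oid: str, size: int) -> bool:
    """Compare a local file against the sha256/size recorded in an LFS pointer."""
    if os.path.getsize(path) != size:
        return False
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB at a time
            h.update(chunk)
    return h.hexdigest() == oid

print(check_lfs("EXP_1.2_3b/tokenizer.json",  # hypothetical local path
                "9c5ae00e602b8860cbd784ba82a8aa14e8feecec692e7076590d014d7b7fdafa",
                11421896))
```
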
EXP_1.2_3b/tokenizer_config.json
ADDED
@@ -0,0 +1,209 @@
+{
+  "add_bos_token": false,
+  "add_prefix_space": false,
+  "added_tokens_decoder": {
+    "151643": {
+      "content": "<|endoftext|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151644": {
+      "content": "<|im_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151645": {
+      "content": "<|im_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151646": {
+      "content": "<|object_ref_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151647": {
+      "content": "<|object_ref_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151648": {
+      "content": "<|box_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151649": {
+      "content": "<|box_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151650": {
+      "content": "<|quad_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151651": {
+      "content": "<|quad_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151652": {
+      "content": "<|vision_start|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151653": {
+      "content": "<|vision_end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151654": {
+      "content": "<|vision_pad|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151655": {
+      "content": "<|image_pad|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151656": {
+      "content": "<|video_pad|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "151657": {
+      "content": "<tool_call>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151658": {
+      "content": "</tool_call>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151659": {
+      "content": "<|fim_prefix|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151660": {
+      "content": "<|fim_middle|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151661": {
+      "content": "<|fim_suffix|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151662": {
+      "content": "<|fim_pad|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151663": {
+      "content": "<|repo_name|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "151664": {
+      "content": "<|file_sep|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    }
+  },
+  "additional_special_tokens": [
+    "<|im_start|>",
+    "<|im_end|>",
+    "<|object_ref_start|>",
+    "<|object_ref_end|>",
+    "<|box_start|>",
+    "<|box_end|>",
+    "<|quad_start|>",
+    "<|quad_end|>",
+    "<|vision_start|>",
+    "<|vision_end|>",
+    "<|vision_pad|>",
+    "<|image_pad|>",
+    "<|video_pad|>"
+  ],
+  "bos_token": null,
+  "clean_up_tokenization_spaces": false,
+  "eos_token": "<|im_end|>",
+  "errors": "replace",
+  "extra_special_tokens": {},
+  "model_max_length": 131072,
+  "pad_token": "<|endoftext|>",
+  "padding_side": "right",
+  "processor_class": "Qwen2_5_VLProcessor",
+  "split_special_tokens": false,
+  "tokenizer_class": "Qwen2Tokenizer",
+  "unk_token": null
+}

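Beyond the added-token table (ids 151643-151664), the config pins right-side padding, a 131072-token context window, and the Qwen2 BPE tokenizer class. A hedged usage sketch; the local path is illustrative, and the ChatML framing comes from the im_start/im_end tokens registered above together with the chat_template.jinja shipped in the same folder:

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("EXP_1.2_3b")  # hypothetical local path

# The added_tokens_decoder entries map straight to ids:
print(tok.convert_tokens_to_ids("<|vision_start|>"))  # expected 151652

messages = [{"role": "user", "content": "Describe the image."}]
text = tok.apply_chat_template(messages, tokenize=False,
                               add_generation_prompt=True)
print(text)  # ChatML-style prompt ending in "<|im_start|>assistant\n"
```
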
EXP_1.2_3b/train_results.json
ADDED
@@ -0,0 +1,8 @@
+{
+  "epoch": 3.0,
+  "total_flos": 56487017447424.0,
+  "train_loss": 0.33897173769248007,
+  "train_runtime": 796.6909,
+  "train_samples_per_second": 26.058,
+  "train_steps_per_second": 0.41
+}

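The throughput numbers are self-consistent: 796.6909 s x 26.058 samples/s is about 20,760 samples, i.e. roughly 6,920 per epoch over 3 epochs, and 796.6909 s x 0.41 steps/s is about 327 optimizer steps, matching the step counter in the log below. The same check in a few lines:

```python
runtime, sps, stps = 796.6909, 26.058, 0.41
print(round(runtime * sps))   # ~20760 samples consumed -> ~6920 per epoch
print(round(runtime * stps))  # ~327 optimizer steps, matching total_steps
```
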
EXP_1.2_3b/trainer_log.jsonl
ADDED
@@ -0,0 +1,33 @@
+{"current_steps": 10, "total_steps": 327, "loss": 0.8529, "lr": 2.7272727272727272e-06, "epoch": 0.09216589861751152, "percentage": 3.06, "elapsed_time": "0:00:29", "remaining_time": "0:15:30"}
+{"current_steps": 20, "total_steps": 327, "loss": 0.6404, "lr": 5.7575757575757586e-06, "epoch": 0.18433179723502305, "percentage": 6.12, "elapsed_time": "0:00:52", "remaining_time": "0:13:28"}
+{"current_steps": 30, "total_steps": 327, "loss": 0.5326, "lr": 8.787878787878788e-06, "epoch": 0.2764976958525346, "percentage": 9.17, "elapsed_time": "0:01:17", "remaining_time": "0:12:47"}
+{"current_steps": 40, "total_steps": 327, "loss": 0.4925, "lr": 9.989726963751683e-06, "epoch": 0.3686635944700461, "percentage": 12.23, "elapsed_time": "0:01:40", "remaining_time": "0:11:58"}
+{"current_steps": 50, "total_steps": 327, "loss": 0.4799, "lr": 9.927100106776213e-06, "epoch": 0.4608294930875576, "percentage": 15.29, "elapsed_time": "0:02:03", "remaining_time": "0:11:22"}
+{"current_steps": 60, "total_steps": 327, "loss": 0.4592, "lr": 9.808267184205182e-06, "epoch": 0.5529953917050692, "percentage": 18.35, "elapsed_time": "0:02:26", "remaining_time": "0:10:51"}
+{"current_steps": 70, "total_steps": 327, "loss": 0.455, "lr": 9.63458378673011e-06, "epoch": 0.6451612903225806, "percentage": 21.41, "elapsed_time": "0:02:50", "remaining_time": "0:10:26"}
+{"current_steps": 80, "total_steps": 327, "loss": 0.4417, "lr": 9.408031213740045e-06, "epoch": 0.7373271889400922, "percentage": 24.46, "elapsed_time": "0:03:13", "remaining_time": "0:09:58"}
+{"current_steps": 90, "total_steps": 327, "loss": 0.4463, "lr": 9.131193871579975e-06, "epoch": 0.8294930875576036, "percentage": 27.52, "elapsed_time": "0:03:36", "remaining_time": "0:09:29"}
+{"current_steps": 100, "total_steps": 327, "loss": 0.4508, "lr": 8.807229791845673e-06, "epoch": 0.9216589861751152, "percentage": 30.58, "elapsed_time": "0:04:00", "remaining_time": "0:09:05"}
+{"current_steps": 110, "total_steps": 327, "loss": 0.4024, "lr": 8.439834606028594e-06, "epoch": 1.0092165898617511, "percentage": 33.64, "elapsed_time": "0:04:23", "remaining_time": "0:08:39"}
+{"current_steps": 120, "total_steps": 327, "loss": 0.3197, "lr": 8.033199387471278e-06, "epoch": 1.1013824884792627, "percentage": 36.7, "elapsed_time": "0:04:47", "remaining_time": "0:08:16"}
+{"current_steps": 130, "total_steps": 327, "loss": 0.3219, "lr": 7.591962841552627e-06, "epoch": 1.1935483870967742, "percentage": 39.76, "elapsed_time": "0:05:11", "remaining_time": "0:07:51"}
+{"current_steps": 140, "total_steps": 327, "loss": 0.3195, "lr": 7.121158389495187e-06, "epoch": 1.2857142857142856, "percentage": 42.81, "elapsed_time": "0:05:33", "remaining_time": "0:07:26"}
+{"current_steps": 150, "total_steps": 327, "loss": 0.3175, "lr": 6.626156749437736e-06, "epoch": 1.3778801843317972, "percentage": 45.87, "elapsed_time": "0:05:57", "remaining_time": "0:07:01"}
+{"current_steps": 160, "total_steps": 327, "loss": 0.3065, "lr": 6.112604669781572e-06, "epoch": 1.4700460829493087, "percentage": 48.93, "elapsed_time": "0:06:21", "remaining_time": "0:06:37"}
+{"current_steps": 170, "total_steps": 327, "loss": 0.3157, "lr": 5.586360513712011e-06, "epoch": 1.5622119815668203, "percentage": 51.99, "elapsed_time": "0:06:43", "remaining_time": "0:06:12"}
+{"current_steps": 180, "total_steps": 327, "loss": 0.311, "lr": 5.053427429716867e-06, "epoch": 1.6543778801843319, "percentage": 55.05, "elapsed_time": "0:07:06", "remaining_time": "0:05:48"}
+{"current_steps": 190, "total_steps": 327, "loss": 0.3131, "lr": 4.5198848704615915e-06, "epoch": 1.7465437788018434, "percentage": 58.1, "elapsed_time": "0:07:28", "remaining_time": "0:05:23"}
+{"current_steps": 200, "total_steps": 327, "loss": 0.2933, "lr": 3.991819241221836e-06, "epoch": 1.838709677419355, "percentage": 61.16, "elapsed_time": "0:07:52", "remaining_time": "0:05:00"}
+{"current_steps": 210, "total_steps": 327, "loss": 0.2828, "lr": 3.475254469003865e-06, "epoch": 1.9308755760368663, "percentage": 64.22, "elapsed_time": "0:08:17", "remaining_time": "0:04:37"}
+{"current_steps": 220, "total_steps": 327, "loss": 0.2656, "lr": 2.976083284388031e-06, "epoch": 2.0184331797235022, "percentage": 67.28, "elapsed_time": "0:08:41", "remaining_time": "0:04:13"}
+{"current_steps": 230, "total_steps": 327, "loss": 0.1944, "lr": 2.5000000000000015e-06, "epoch": 2.110599078341014, "percentage": 70.34, "elapsed_time": "0:09:03", "remaining_time": "0:03:49"}
+{"current_steps": 240, "total_steps": 327, "loss": 0.1971, "lr": 2.0524355524417017e-06, "epoch": 2.2027649769585254, "percentage": 73.39, "elapsed_time": "0:09:26", "remaining_time": "0:03:25"}
+{"current_steps": 250, "total_steps": 327, "loss": 0.1971, "lr": 1.6384955486934157e-06, "epoch": 2.294930875576037, "percentage": 76.45, "elapsed_time": "0:09:50", "remaining_time": "0:03:01"}
+{"current_steps": 260, "total_steps": 327, "loss": 0.1934, "lr": 1.2629020237248241e-06, "epoch": 2.3870967741935485, "percentage": 79.51, "elapsed_time": "0:10:15", "remaining_time": "0:02:38"}
+{"current_steps": 270, "total_steps": 327, "loss": 0.1975, "lr": 9.299395737170758e-07, "epoch": 2.47926267281106, "percentage": 82.57, "elapsed_time": "0:10:39", "remaining_time": "0:02:14"}
+{"current_steps": 280, "total_steps": 327, "loss": 0.2041, "lr": 6.43406479383053e-07, "epoch": 2.571428571428571, "percentage": 85.63, "elapsed_time": "0:11:01", "remaining_time": "0:01:51"}
+{"current_steps": 290, "total_steps": 327, "loss": 0.1979, "lr": 4.0657137694820826e-07, "epoch": 2.6635944700460827, "percentage": 88.69, "elapsed_time": "0:11:26", "remaining_time": "0:01:27"}
+{"current_steps": 300, "total_steps": 327, "loss": 0.1919, "lr": 2.2213597106929608e-07, "epoch": 2.7557603686635943, "percentage": 91.74, "elapsed_time": "0:11:48", "remaining_time": "0:01:03"}
+{"current_steps": 310, "total_steps": 327, "loss": 0.1897, "lr": 9.22042150446728e-08, "epoch": 2.847926267281106, "percentage": 94.8, "elapsed_time": "0:12:12", "remaining_time": "0:00:40"}
+{"current_steps": 320, "total_steps": 327, "loss": 0.1877, "lr": 1.8258309893965375e-08, "epoch": 2.9400921658986174, "percentage": 97.86, "elapsed_time": "0:12:37", "remaining_time": "0:00:16"}
+{"current_steps": 327, "total_steps": 327, "epoch": 3.0, "percentage": 100.0, "elapsed_time": "0:13:14", "remaining_time": "0:00:00"}

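Each JSONL record is one logging event (every 10 steps here); the final record closes the run and carries no loss field. A short sketch that re-derives a loss curve like the training_loss.png shipped further down (the path is illustrative):

```python
import json
import matplotlib.pyplot as plt

steps, losses = [], []
with open("EXP_1.2_3b/trainer_log.jsonl") as f:  # hypothetical local path
    for line in f:
        rec = json.loads(line)
        if "loss" in rec:  # the final summary record has no loss field
            steps.append(rec["current_steps"])
            losses.append(rec["loss"])

plt.plot(steps, losses)
plt.xlabel("step")
plt.ylabel("training loss")
plt.savefig("training_loss.png")
```
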
EXP_1.2_3b/trainer_state.json
ADDED
@@ -0,0 +1,267 @@
+{
+  "best_global_step": null,
+  "best_metric": null,
+  "best_model_checkpoint": null,
+  "epoch": 3.0,
+  "eval_steps": 500,
+  "global_step": 327,
+  "is_hyper_param_search": false,
+  "is_local_process_zero": true,
+  "is_world_process_zero": true,
+  "log_history": [
+    {
+      "epoch": 0.09216589861751152,
+      "grad_norm": 4.912416428745171,
+      "learning_rate": 2.7272727272727272e-06,
+      "loss": 0.8529,
+      "step": 10
+    },
+    {
+      "epoch": 0.18433179723502305,
+      "grad_norm": 2.1443951476400227,
+      "learning_rate": 5.7575757575757586e-06,
+      "loss": 0.6404,
+      "step": 20
+    },
+    {
+      "epoch": 0.2764976958525346,
+      "grad_norm": 1.658722389064836,
+      "learning_rate": 8.787878787878788e-06,
+      "loss": 0.5326,
+      "step": 30
+    },
+    {
+      "epoch": 0.3686635944700461,
+      "grad_norm": 1.5618873833827294,
+      "learning_rate": 9.989726963751683e-06,
+      "loss": 0.4925,
+      "step": 40
+    },
+    {
+      "epoch": 0.4608294930875576,
+      "grad_norm": 1.4925009785754748,
+      "learning_rate": 9.927100106776213e-06,
+      "loss": 0.4799,
+      "step": 50
+    },
+    {
+      "epoch": 0.5529953917050692,
+      "grad_norm": 1.322188289534146,
+      "learning_rate": 9.808267184205182e-06,
+      "loss": 0.4592,
+      "step": 60
+    },
+    {
+      "epoch": 0.6451612903225806,
+      "grad_norm": 1.4524544851906915,
+      "learning_rate": 9.63458378673011e-06,
+      "loss": 0.455,
+      "step": 70
+    },
+    {
+      "epoch": 0.7373271889400922,
+      "grad_norm": 1.400758206854473,
+      "learning_rate": 9.408031213740045e-06,
+      "loss": 0.4417,
+      "step": 80
+    },
+    {
+      "epoch": 0.8294930875576036,
+      "grad_norm": 1.393729749690361,
+      "learning_rate": 9.131193871579975e-06,
+      "loss": 0.4463,
+      "step": 90
+    },
+    {
+      "epoch": 0.9216589861751152,
+      "grad_norm": 1.4880361350108735,
+      "learning_rate": 8.807229791845673e-06,
+      "loss": 0.4508,
+      "step": 100
+    },
+    {
+      "epoch": 1.0092165898617511,
+      "grad_norm": 1.299774055064554,
+      "learning_rate": 8.439834606028594e-06,
+      "loss": 0.4024,
+      "step": 110
+    },
+    {
+      "epoch": 1.1013824884792627,
+      "grad_norm": 1.298198957499482,
+      "learning_rate": 8.033199387471278e-06,
+      "loss": 0.3197,
+      "step": 120
+    },
+    {
+      "epoch": 1.1935483870967742,
+      "grad_norm": 1.3650000791809225,
+      "learning_rate": 7.591962841552627e-06,
+      "loss": 0.3219,
+      "step": 130
+    },
+    {
+      "epoch": 1.2857142857142856,
+      "grad_norm": 1.434265135129886,
+      "learning_rate": 7.121158389495187e-06,
+      "loss": 0.3195,
+      "step": 140
+    },
+    {
+      "epoch": 1.3778801843317972,
+      "grad_norm": 1.495992136082361,
+      "learning_rate": 6.626156749437736e-06,
+      "loss": 0.3175,
+      "step": 150
+    },
+    {
+      "epoch": 1.4700460829493087,
+      "grad_norm": 1.3523477092551237,
+      "learning_rate": 6.112604669781572e-06,
+      "loss": 0.3065,
+      "step": 160
+    },
+    {
+      "epoch": 1.5622119815668203,
+      "grad_norm": 1.2695226148335443,
+      "learning_rate": 5.586360513712011e-06,
+      "loss": 0.3157,
+      "step": 170
+    },
+    {
+      "epoch": 1.6543778801843319,
+      "grad_norm": 1.2984094624484053,
+      "learning_rate": 5.053427429716867e-06,
+      "loss": 0.311,
+      "step": 180
+    },
+    {
+      "epoch": 1.7465437788018434,
+      "grad_norm": 1.465758872280537,
+      "learning_rate": 4.5198848704615915e-06,
+      "loss": 0.3131,
+      "step": 190
+    },
+    {
+      "epoch": 1.838709677419355,
+      "grad_norm": 1.3727910421793548,
+      "learning_rate": 3.991819241221836e-06,
+      "loss": 0.2933,
+      "step": 200
+    },
+    {
+      "epoch": 1.9308755760368663,
+      "grad_norm": 1.2600662931978257,
+      "learning_rate": 3.475254469003865e-06,
+      "loss": 0.2828,
+      "step": 210
+    },
+    {
+      "epoch": 2.0184331797235022,
+      "grad_norm": 1.2847409994029446,
+      "learning_rate": 2.976083284388031e-06,
+      "loss": 0.2656,
+      "step": 220
+    },
+    {
+      "epoch": 2.110599078341014,
+      "grad_norm": 1.4825208018701794,
+      "learning_rate": 2.5000000000000015e-06,
+      "loss": 0.1944,
+      "step": 230
+    },
+    {
+      "epoch": 2.2027649769585254,
+      "grad_norm": 1.4346771852112512,
+      "learning_rate": 2.0524355524417017e-06,
+      "loss": 0.1971,
+      "step": 240
+    },
+    {
+      "epoch": 2.294930875576037,
+      "grad_norm": 1.2773781048743025,
+      "learning_rate": 1.6384955486934157e-06,
+      "loss": 0.1971,
+      "step": 250
+    },
+    {
+      "epoch": 2.3870967741935485,
+      "grad_norm": 1.3513953567558703,
+      "learning_rate": 1.2629020237248241e-06,
+      "loss": 0.1934,
+      "step": 260
+    },
+    {
+      "epoch": 2.47926267281106,
+      "grad_norm": 1.1153786023075227,
+      "learning_rate": 9.299395737170758e-07,
+      "loss": 0.1975,
+      "step": 270
+    },
+    {
+      "epoch": 2.571428571428571,
+      "grad_norm": 1.255516210757092,
+      "learning_rate": 6.43406479383053e-07,
+      "loss": 0.2041,
+      "step": 280
+    },
+    {
+      "epoch": 2.6635944700460827,
+      "grad_norm": 1.2757448362037527,
+      "learning_rate": 4.0657137694820826e-07,
+      "loss": 0.1979,
+      "step": 290
+    },
+    {
+      "epoch": 2.7557603686635943,
+      "grad_norm": 1.218519400711122,
+      "learning_rate": 2.2213597106929608e-07,
+      "loss": 0.1919,
+      "step": 300
+    },
+    {
+      "epoch": 2.847926267281106,
+      "grad_norm": 1.2928947540930629,
+      "learning_rate": 9.22042150446728e-08,
+      "loss": 0.1897,
+      "step": 310
+    },
+    {
+      "epoch": 2.9400921658986174,
+      "grad_norm": 1.2273178341773785,
+      "learning_rate": 1.8258309893965375e-08,
+      "loss": 0.1877,
+      "step": 320
+    },
+    {
+      "epoch": 3.0,
+      "step": 327,
+      "total_flos": 56487017447424.0,
+      "train_loss": 0.33897173769248007,
+      "train_runtime": 796.6909,
+      "train_samples_per_second": 26.058,
+      "train_steps_per_second": 0.41
+    }
+  ],
+  "logging_steps": 10,
+  "max_steps": 327,
+  "num_input_tokens_seen": 0,
+  "num_train_epochs": 3,
+  "save_steps": 1000,
+  "stateful_callbacks": {
+    "TrainerControl": {
+      "args": {
+        "should_epoch_stop": false,
+        "should_evaluate": false,
+        "should_log": false,
+        "should_save": true,
+        "should_training_stop": true
+      },
+      "attributes": {}
+    }
+  },
+  "total_flos": 56487017447424.0,
+  "train_batch_size": 4,
+  "trial_name": null,
+  "trial_params": null
+}

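One number the state file implies but does not record: ~20,760 samples over 327 optimizer steps is ~63.5 samples per step, so with train_batch_size 4 per device the product of gradient accumulation and world size was roughly 16. The exact split between the two is not stored here, so this is an inference, not a logged value:

```python
samples = round(796.6909 * 26.058)   # ~20760, from train_results.json
steps, per_device = 327, 4           # global_step and train_batch_size above
print(samples / steps)               # ~63.5 samples per optimizer step
print(samples / steps / per_device)  # ~15.9 -> grad-accum x world size of ~16
```
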
EXP_1.2_3b/training_args.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f71a34ce7185ab0118e74c9a95e848341684e12450857a06282b8ccf3a277b4d
+size 7953

EXP_1.2_3b/training_loss.png
ADDED

EXP_1.2_3b/video_preprocessor_config.json
ADDED
|
@@ -0,0 +1,86 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
{
|
| 2 |
+
"_valid_kwargs_names": [
|
| 3 |
+
"do_convert_rgb",
|
| 4 |
+
"do_resize",
|
| 5 |
+
"size",
|
| 6 |
+
"size_divisor",
|
| 7 |
+
"default_to_square",
|
| 8 |
+
"resample",
|
| 9 |
+
"do_rescale",
|
| 10 |
+
"rescale_factor",
|
| 11 |
+
"do_normalize",
|
| 12 |
+
"image_mean",
|
| 13 |
+
"image_std",
|
| 14 |
+
"do_pad",
|
| 15 |
+
"do_center_crop",
|
| 16 |
+
"crop_size",
|
| 17 |
+
"data_format",
|
| 18 |
+
"input_data_format",
|
| 19 |
+
"device",
|
| 20 |
+
"min_pixels",
|
| 21 |
+
"max_pixels",
|
| 22 |
+
"patch_size",
|
| 23 |
+
"temporal_patch_size",
|
| 24 |
+
"merge_size"
|
| 25 |
+
],
|
| 26 |
+
"crop_size": null,
|
| 27 |
+
"data_format": "channels_first",
|
| 28 |
+
"default_to_square": true,
|
| 29 |
+
"device": null,
|
| 30 |
+
"do_center_crop": null,
|
| 31 |
+
"do_convert_rgb": true,
|
| 32 |
+
"do_normalize": true,
|
| 33 |
+
"do_pad": null,
|
| 34 |
+
"do_rescale": true,
|
| 35 |
+
"do_resize": true,
|
| 36 |
+
"image_mean": [
|
| 37 |
+
0.48145466,
|
| 38 |
+
0.4578275,
|
| 39 |
+
0.40821073
|
| 40 |
+
],
|
| 41 |
+
"image_processor_type": "Qwen2VLImageProcessor",
|
| 42 |
+
"image_std": [
|
| 43 |
+
0.26862954,
|
| 44 |
+
0.26130258,
|
| 45 |
+
0.27577711
|
| 46 |
+
],
|
| 47 |
+
"input_data_format": null,
|
| 48 |
+
"max_pixels": 12845056,
|
| 49 |
+
"merge_size": 2,
|
| 50 |
+
"min_pixels": 3136,
|
| 51 |
+
"model_valid_processing_keys": [
|
| 52 |
+
"do_convert_rgb",
|
| 53 |
+
"do_resize",
|
| 54 |
+
"size",
|
| 55 |
+
"size_divisor",
|
| 56 |
+
"default_to_square",
|
| 57 |
+
"resample",
|
| 58 |
+
"do_rescale",
|
| 59 |
+
"rescale_factor",
|
| 60 |
+
"do_normalize",
|
| 61 |
+
"image_mean",
|
| 62 |
+
"image_std",
|
| 63 |
+
"do_pad",
|
| 64 |
+
"do_center_crop",
|
| 65 |
+
"crop_size",
|
| 66 |
+
"data_format",
|
| 67 |
+
"input_data_format",
|
| 68 |
+
"device",
|
| 69 |
+
"min_pixels",
|
| 70 |
+
"max_pixels",
|
| 71 |
+
"patch_size",
|
| 72 |
+
"temporal_patch_size",
|
| 73 |
+
"merge_size"
|
| 74 |
+
],
|
| 75 |
+
"patch_size": 14,
|
| 76 |
+
"processor_class": "Qwen2_5_VLProcessor",
|
| 77 |
+
"resample": 3,
|
| 78 |
+
"rescale_factor": 0.00392156862745098,
|
| 79 |
+
"size": {
|
| 80 |
+
"longest_edge": 12845056,
|
| 81 |
+
"shortest_edge": 3136
|
| 82 |
+
},
|
| 83 |
+
"size_divisor": null,
|
| 84 |
+
"temporal_patch_size": 2,
|
| 85 |
+
"video_processor_type": "Qwen2VLVideoProcessor"
|
| 86 |
+
}
|
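The min_pixels / max_pixels and size entries above bound how the Qwen2.5-VL processor resizes images and video frames before patching (patch_size 14, temporal_patch_size 2, merge_size 2). A minimal sketch, assuming the standard transformers AutoProcessor API, of loading the processor from this folder and overriding those bounds (the local path and the pixel values are assumptions, not part of this commit):

```python
from transformers import AutoProcessor

# Point at the experiment folder that contains preprocessor_config.json and
# video_preprocessor_config.json from this commit (path is an assumption).
processor = AutoProcessor.from_pretrained(
    "EXP_1.2_3b",
    min_pixels=256 * 28 * 28,    # lower bound on pixels per image/frame
    max_pixels=1280 * 28 * 28,   # upper bound; the defaults above are 3136 / 12845056
)
print(processor.image_processor)
```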
EXP_1.2_3b/vocab.json
ADDED
The diff for this file is too large to render.
See raw diff
EXP_2.1_3b/README.md
ADDED
@@ -0,0 +1,69 @@
---
library_name: transformers
license: other
base_model: /mnt/nvme/hyz/hf/models/Qwen2.5-VL-3B-Instruct
tags:
- llama-factory
- full
- generated_from_trainer
model-index:
- name: EXP_2.1_3b
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# EXP_2.1_3b

This model is a fine-tuned version of [/mnt/nvme/hyz/hf/models/Qwen2.5-VL-3B-Instruct](https://huggingface.co//mnt/nvme/hyz/hf/models/Qwen2.5-VL-3B-Instruct) on the txt_img dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5101

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- total_eval_batch_size: 8
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3.0

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.5338        | 0.7107 | 500  | 0.5341          |
| 0.4883        | 1.4208 | 1000 | 0.5142          |
| 0.4397        | 2.1322 | 1500 | 0.5145          |
| 0.436         | 2.8429 | 2000 | 0.5102          |

### Framework versions

- Transformers 4.52.4
- Pytorch 2.7.1+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1
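The hyperparameters above describe the LLaMA-Factory full fine-tune (global batch 64 across 8 GPUs, cosine schedule, 3 epochs); using the resulting checkpoint for inference is independent of that setup. A minimal sketch, assuming the folder is used as a local transformers checkpoint with the standard Qwen2.5-VL classes (the local path and the prompt are assumptions):

```python
import torch
from transformers import AutoProcessor, Qwen2_5_VLForConditionalGeneration

# Load the fine-tuned checkpoint from the experiment folder (local path is an
# assumption; a Hub repo id would work the same way).
model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
    "EXP_2.1_3b",
    torch_dtype=torch.bfloat16,
    device_map="auto",   # requires accelerate
)
processor = AutoProcessor.from_pretrained("EXP_2.1_3b")

messages = [{"role": "user",
             "content": [{"type": "text", "text": "Describe the txt_img task format."}]}]
text = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = processor(text=[text], return_tensors="pt").to(model.device)

out = model.generate(**inputs, max_new_tokens=64)
print(processor.batch_decode(out[:, inputs["input_ids"].shape[1]:],
                             skip_special_tokens=True)[0])
```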
EXP_2.1_3b/added_tokens.json
ADDED
@@ -0,0 +1,24 @@
{
  "</tool_call>": 151658,
  "<tool_call>": 151657,
  "<|box_end|>": 151649,
  "<|box_start|>": 151648,
  "<|endoftext|>": 151643,
  "<|file_sep|>": 151664,
  "<|fim_middle|>": 151660,
  "<|fim_pad|>": 151662,
  "<|fim_prefix|>": 151659,
  "<|fim_suffix|>": 151661,
  "<|im_end|>": 151645,
  "<|im_start|>": 151644,
  "<|image_pad|>": 151655,
  "<|object_ref_end|>": 151647,
  "<|object_ref_start|>": 151646,
  "<|quad_end|>": 151651,
  "<|quad_start|>": 151650,
  "<|repo_name|>": 151663,
  "<|video_pad|>": 151656,
  "<|vision_end|>": 151653,
  "<|vision_pad|>": 151654,
  "<|vision_start|>": 151652
}
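added_tokens.json pins the IDs of the chat and vision special tokens (for example <|image_pad|> = 151655); these must match what the model saw during training. A small sketch, assuming the tokenizer files from this commit sit in the same folder (path is an assumption), to sanity-check the mapping:

```python
from transformers import AutoTokenizer

# Load the tokenizer shipped with the checkpoint (local path is an assumption).
tok = AutoTokenizer.from_pretrained("EXP_2.1_3b")

for token in ("<|im_start|>", "<|vision_start|>", "<|image_pad|>", "<|vision_end|>"):
    # Should print the IDs recorded in added_tokens.json above.
    print(token, "->", tok.convert_tokens_to_ids(token))
```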