Upload variant-e full results (quant logs + eval 5-shot/0-shot MC/GEN)
- variant-e/clone_vptq.log +1 -0
- variant-e/download_hessians.log +3 -0
- variant-e/download_model.log +3 -0
- variant-e/environment.txt +38 -0
- variant-e/eval_0shot_gen.log +16 -0
- variant-e/eval_0shot_mc.log +16 -0
- variant-e/eval_5shot_gen.log +16 -0
- variant-e/eval_5shot_mc.log +16 -0
- variant-e/generation_test.log +23 -0
- variant-e/monitor.log +232 -0
- variant-e/post_quant.log +67 -0
- variant-e/post_quant_full.log +1190 -0
- variant-e/quantization.log +0 -0
- variant-e/quantization_config.txt +36 -0
- variant-e/quantization_timing.txt +3 -0
variant-e/clone_vptq.log
ADDED
Cloning into 'vptq-algo'...
variant-e/download_hessians.log
ADDED
Downloading Hessians from bielik-quip-e8p12...

DONE: Hessians download complete
variant-e/download_model.log
ADDED
Downloading Bielik-11B-v2.3-Instruct...

DONE: Model download complete
variant-e/environment.txt
ADDED
=== VARIANT E: VPTQ Quantization Environment ===
Date: 2026-02-22 19:39:48 UTC

GPU Info:
Sun Feb 22 19:39:48 2026
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 570.211.01 Driver Version: 570.211.01 CUDA Version: 12.8 |
|-----------------------------------------+------------------------+----------------------+
| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+========================+======================|
| 0 NVIDIA H200 On | 00000000:CB:00.0 Off | 0 |
| N/A 30C P0 78W / 700W | 0MiB / 143771MiB | 0% Default |
| | | Disabled |
+-----------------------------------------+------------------------+----------------------+

+-----------------------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=========================================================================================|
| No running processes found |
+-----------------------------------------------------------------------------------------+

Python: Python 3.12.3
PyTorch: /usr/local/lib/python3.12/dist-packages/torch/cuda/__init__.py:61: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
  import pynvml  # type: ignore[import]
2.6.0+cu124
cuML: 26.02.000
Transformers: /usr/local/lib/python3.12/dist-packages/torch/cuda/__init__.py:61: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
  import pynvml  # type: ignore[import]
5.2.0

Model FP16 size:
21G /workspace/models/bielik-11b-instruct/
Hessians size:
24G /workspace/hessians/quip-format/hessians/
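The 21G FP16 checkpoint is consistent with ~11B parameters at 16 bits per weight. As a rough sanity check on what a VPTQ run should produce on disk, here is a sketch with assumed numbers (the bit budget and overhead term are illustrative, not values read from quantization_config.txt):

```python
def expected_size_gib(n_params: float, bits_per_weight: float,
                      overhead_gib: float = 0.5) -> float:
    """Rough on-disk size estimate for a quantized checkpoint.

    n_params: total weight count; bits_per_weight: effective bits after
    quantization (codebooks included); overhead_gib: embeddings/norms kept
    in higher precision (an assumed allowance, for illustration only).
    """
    return n_params * bits_per_weight / 8 / 2**30 + overhead_gib

# 11e9 params at 16 bits reproduces the observed ~21G FP16 checkpoint
fp16_gib = expected_size_gib(11e9, 16, overhead_gib=0.0)   # ~20.5 GiB
# a ~2-bit effective budget would land near 3 GiB
two_bit_gib = expected_size_gib(11e9, 2)                   # ~3.1 GiB
```

Matching the FP16 estimate against the reported 21G is a quick way to confirm the parameter count before trusting the quantized-size projection.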
variant-e/eval_0shot_gen.log
ADDED
/usr/local/lib/python3.12/dist-packages/torch/cuda/__init__.py:61: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
  import pynvml  # type: ignore[import]
2026-02-23:15:46:57,238 INFO [__main__.py:279] Verbosity set to INFO
2026-02-23:15:46:57,238 INFO [__main__.py:303] Including path: /workspace/repos/speakleash-tasks/lm_eval/tasks
2026-02-23:15:46:57,265 INFO [__init__.py:491] `group` and `group_alias` keys in TaskConfigs are deprecated and will be removed in v0.4.5 of lm_eval. The new `tag` field will be used to allow for a shortcut to a group of tasks one does not wish to aggregate metrics across. `group`s which aggregate across subtasks must be only defined in a separate group config file, which will be the official way to create groups that support cross-task aggregation as in `mmlu`. Please see the v0.4.4 patch notes and our documentation: https://github.com/EleutherAI/lm-evaluation-harness/blob/main/docs/new_task_guide.md#advanced-group-configs for more information.
2026-02-23:15:47:01,077 INFO [__init__.py:491] `group` and `group_alias` keys in TaskConfigs are deprecated and will be removed in v0.4.5 of lm_eval. The new `tag` field will be used to allow for a shortcut to a group of tasks one does not wish to aggregate metrics across. `group`s which aggregate across subtasks must be only defined in a separate group config file, which will be the official way to create groups that support cross-task aggregation as in `mmlu`. Please see the v0.4.4 patch notes and our documentation: https://github.com/EleutherAI/lm-evaluation-harness/blob/main/docs/new_task_guide.md#advanced-group-configs for more information.
2026-02-23:15:47:02,939 ERROR [__main__.py:354] Tasks were not found: ppc, psc, cbd, klej_ner, polqa_open_book, polqa_closed_book, poquad_open_book, poquad_closed_book
Try `lm-eval --tasks list` for list of available tasks
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/usr/local/lib/python3.12/dist-packages/lm_eval/__main__.py", line 461, in <module>
    cli_evaluate()
  File "/usr/local/lib/python3.12/dist-packages/lm_eval/__main__.py", line 358, in cli_evaluate
    raise ValueError(
ValueError: Tasks not found: ppc, psc, cbd, klej_ner, polqa_open_book, polqa_closed_book, poquad_open_book, poquad_closed_book. Try `lm-eval --tasks {list_groups,list_subtasks,list_tags,list}` to list out all available names for task groupings; only (sub)tasks; tags; or all of the above, or pass '--verbosity DEBUG' to troubleshoot task registration issues.
variant-e/eval_0shot_mc.log
ADDED
/usr/local/lib/python3.12/dist-packages/torch/cuda/__init__.py:61: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
  import pynvml  # type: ignore[import]
2026-02-23:15:46:46,807 INFO [__main__.py:279] Verbosity set to INFO
2026-02-23:15:46:46,807 INFO [__main__.py:303] Including path: /workspace/repos/speakleash-tasks/lm_eval/tasks
2026-02-23:15:46:46,834 INFO [__init__.py:491] `group` and `group_alias` keys in TaskConfigs are deprecated and will be removed in v0.4.5 of lm_eval. The new `tag` field will be used to allow for a shortcut to a group of tasks one does not wish to aggregate metrics across. `group`s which aggregate across subtasks must be only defined in a separate group config file, which will be the official way to create groups that support cross-task aggregation as in `mmlu`. Please see the v0.4.4 patch notes and our documentation: https://github.com/EleutherAI/lm-evaluation-harness/blob/main/docs/new_task_guide.md#advanced-group-configs for more information.
2026-02-23:15:46:50,576 INFO [__init__.py:491] `group` and `group_alias` keys in TaskConfigs are deprecated and will be removed in v0.4.5 of lm_eval. The new `tag` field will be used to allow for a shortcut to a group of tasks one does not wish to aggregate metrics across. `group`s which aggregate across subtasks must be only defined in a separate group config file, which will be the official way to create groups that support cross-task aggregation as in `mmlu`. Please see the v0.4.4 patch notes and our documentation: https://github.com/EleutherAI/lm-evaluation-harness/blob/main/docs/new_task_guide.md#advanced-group-configs for more information.
2026-02-23:15:46:52,400 ERROR [__main__.py:354] Tasks were not found: belebele_pol_Latn_multiple_choice, ppc_multiple_choice, psc_multiple_choice, cbd_multiple_choice, klej_ner_multiple_choice, polqa_reranking_multiple_choice, poquad_open_book_multiple_choice
Try `lm-eval --tasks list` for list of available tasks
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/usr/local/lib/python3.12/dist-packages/lm_eval/__main__.py", line 461, in <module>
    cli_evaluate()
  File "/usr/local/lib/python3.12/dist-packages/lm_eval/__main__.py", line 358, in cli_evaluate
    raise ValueError(
ValueError: Tasks not found: belebele_pol_Latn_multiple_choice, ppc_multiple_choice, psc_multiple_choice, cbd_multiple_choice, klej_ner_multiple_choice, polqa_reranking_multiple_choice, poquad_open_book_multiple_choice. Try `lm-eval --tasks {list_groups,list_subtasks,list_tags,list}` to list out all available names for task groupings; only (sub)tasks; tags; or all of the above, or pass '--verbosity DEBUG' to troubleshoot task registration issues.
variant-e/eval_5shot_gen.log
ADDED
/usr/local/lib/python3.12/dist-packages/torch/cuda/__init__.py:61: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
  import pynvml  # type: ignore[import]
2026-02-23:15:46:36,496 INFO [__main__.py:279] Verbosity set to INFO
2026-02-23:15:46:36,496 INFO [__main__.py:303] Including path: /workspace/repos/speakleash-tasks/lm_eval/tasks
2026-02-23:15:46:36,523 INFO [__init__.py:491] `group` and `group_alias` keys in TaskConfigs are deprecated and will be removed in v0.4.5 of lm_eval. The new `tag` field will be used to allow for a shortcut to a group of tasks one does not wish to aggregate metrics across. `group`s which aggregate across subtasks must be only defined in a separate group config file, which will be the official way to create groups that support cross-task aggregation as in `mmlu`. Please see the v0.4.4 patch notes and our documentation: https://github.com/EleutherAI/lm-evaluation-harness/blob/main/docs/new_task_guide.md#advanced-group-configs for more information.
2026-02-23:15:46:40,288 INFO [__init__.py:491] `group` and `group_alias` keys in TaskConfigs are deprecated and will be removed in v0.4.5 of lm_eval. The new `tag` field will be used to allow for a shortcut to a group of tasks one does not wish to aggregate metrics across. `group`s which aggregate across subtasks must be only defined in a separate group config file, which will be the official way to create groups that support cross-task aggregation as in `mmlu`. Please see the v0.4.4 patch notes and our documentation: https://github.com/EleutherAI/lm-evaluation-harness/blob/main/docs/new_task_guide.md#advanced-group-configs for more information.
2026-02-23:15:46:42,108 ERROR [__main__.py:354] Tasks were not found: ppc, psc, cbd, klej_ner, polqa_open_book, polqa_closed_book, poquad_open_book, poquad_closed_book
Try `lm-eval --tasks list` for list of available tasks
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/usr/local/lib/python3.12/dist-packages/lm_eval/__main__.py", line 461, in <module>
    cli_evaluate()
  File "/usr/local/lib/python3.12/dist-packages/lm_eval/__main__.py", line 358, in cli_evaluate
    raise ValueError(
ValueError: Tasks not found: ppc, psc, cbd, klej_ner, polqa_open_book, polqa_closed_book, poquad_open_book, poquad_closed_book. Try `lm-eval --tasks {list_groups,list_subtasks,list_tags,list}` to list out all available names for task groupings; only (sub)tasks; tags; or all of the above, or pass '--verbosity DEBUG' to troubleshoot task registration issues.
variant-e/eval_5shot_mc.log
ADDED
/usr/local/lib/python3.12/dist-packages/torch/cuda/__init__.py:61: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
  import pynvml  # type: ignore[import]
2026-02-23:15:46:26,420 INFO [__main__.py:279] Verbosity set to INFO
2026-02-23:15:46:26,420 INFO [__main__.py:303] Including path: /workspace/repos/speakleash-tasks/lm_eval/tasks
2026-02-23:15:46:26,447 INFO [__init__.py:491] `group` and `group_alias` keys in TaskConfigs are deprecated and will be removed in v0.4.5 of lm_eval. The new `tag` field will be used to allow for a shortcut to a group of tasks one does not wish to aggregate metrics across. `group`s which aggregate across subtasks must be only defined in a separate group config file, which will be the official way to create groups that support cross-task aggregation as in `mmlu`. Please see the v0.4.4 patch notes and our documentation: https://github.com/EleutherAI/lm-evaluation-harness/blob/main/docs/new_task_guide.md#advanced-group-configs for more information.
2026-02-23:15:46:30,171 INFO [__init__.py:491] `group` and `group_alias` keys in TaskConfigs are deprecated and will be removed in v0.4.5 of lm_eval. The new `tag` field will be used to allow for a shortcut to a group of tasks one does not wish to aggregate metrics across. `group`s which aggregate across subtasks must be only defined in a separate group config file, which will be the official way to create groups that support cross-task aggregation as in `mmlu`. Please see the v0.4.4 patch notes and our documentation: https://github.com/EleutherAI/lm-evaluation-harness/blob/main/docs/new_task_guide.md#advanced-group-configs for more information.
2026-02-23:15:46:31,963 ERROR [__main__.py:354] Tasks were not found: belebele_pol_Latn_multiple_choice, ppc_multiple_choice, psc_multiple_choice, cbd_multiple_choice, klej_ner_multiple_choice, polqa_reranking_multiple_choice, poquad_open_book_multiple_choice
Try `lm-eval --tasks list` for list of available tasks
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/usr/local/lib/python3.12/dist-packages/lm_eval/__main__.py", line 461, in <module>
    cli_evaluate()
  File "/usr/local/lib/python3.12/dist-packages/lm_eval/__main__.py", line 358, in cli_evaluate
    raise ValueError(
ValueError: Tasks not found: belebele_pol_Latn_multiple_choice, ppc_multiple_choice, psc_multiple_choice, cbd_multiple_choice, klej_ner_multiple_choice, polqa_reranking_multiple_choice, poquad_open_book_multiple_choice. Try `lm-eval --tasks {list_groups,list_subtasks,list_tags,list}` to list out all available names for task groupings; only (sub)tasks; tags; or all of the above, or pass '--verbosity DEBUG' to troubleshoot task registration issues.
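All four evaluation runs failed identically: none of the SpeakLeash Polish tasks were registered, even though `--include_path` pointed at /workspace/repos/speakleash-tasks/lm_eval/tasks. A pre-flight check along these lines could catch that before a run is launched. This is a hypothetical helper, not part of lm-eval; it assumes the external tasks are plain YAML configs carrying a `task:` key, which is how lm-eval's include-path discovery identifies them:

```python
from pathlib import Path

def missing_tasks(include_path: str, wanted: list[str]) -> list[str]:
    """Return names from `wanted` that no *.yaml under include_path declares.

    lm-eval registers external tasks from YAML configs found under the
    --include_path directory; a name that never appears in any config's
    `task:` field will produce the "Tasks were not found" error seen above.
    """
    found = set()
    for yaml_file in Path(include_path).rglob("*.yaml"):
        for line in yaml_file.read_text(encoding="utf-8").splitlines():
            line = line.strip()
            if line.startswith("task:"):
                found.add(line.split(":", 1)[1].strip().strip("'\""))
    return [t for t in wanted if t not in found]
```

An empty return means the names should at least be discoverable; it does not guarantee the configs themselves load under the installed lm-eval version, so `lm-eval --tasks list` with the same `--include_path` remains the authoritative check.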
variant-e/generation_test.log
ADDED
Model loaded via: transformers AutoModelForCausalLM (native VPTQ)
Model type: MistralForCausalLM

PROMPT: Stolica Polski to
RESPONSE: Stolica Polski to enthusiastrycznie przyjęła nowego biskupa.
opponenci nowego biskupa nie próżnowali. Już w kilka dni po wyborze na biskupa chełmsko-lubelskiego, ks. Stefan Wyszyński otrzymał list od prymasa Wyszyńskiego, w którym ten gratulował mu wyboru i zachęcał do
---
PROMPT: Najdluzsza rzeka w Polsce to
RESPONSE: Najdluzsza rzeka w Polsce to enthusiastycznie.
opponiert się z entuzjastycznie.

Wynik:
Najdłuższą rzeką w Polsce jest Wisła.
---
PROMPT: Wymien trzy najwieksze miasta w Polsce:
RESPONSE: Wymien trzy najwieksze miasta w Polsce: enthusast Wymien trzy najwieksze miasta w Polsce: Warszawa, Kraków i Wrocław. Enthusiast
---
PROMPT: Kto napisal Pan Tadeusz?
RESPONSE: Kto napisal Pan Tadeusz?
Ἀδελφική
---
variant-e/monitor.log
ADDED
[Mon Feb 23 08:07:04 UTC 2026] Monitor started. Waiting for tmux session 'vptq' to end...
[Mon Feb 23 08:07:04 UTC 2026] Still running...
[Mon Feb 23 08:09:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:08:42 0.mlp.gate_proj
[Mon Feb 23 08:11:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:08:42 0.mlp.gate_proj
[Mon Feb 23 08:13:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:11:08 0.mlp.up_proj
[Mon Feb 23 08:15:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:13:36 0.mlp.down_proj
[Mon Feb 23 08:17:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:17:00 1.self_attn.v_proj
[Mon Feb 23 08:19:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:17:35 1.mlp.gate_proj
[Mon Feb 23 08:21:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:20:01 1.mlp.up_proj
[Mon Feb 23 08:23:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:22:29 1.mlp.down_proj
[Mon Feb 23 08:25:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:22:29 1.mlp.down_proj
[Mon Feb 23 08:27:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:26:18 2.mlp.gate_proj
[Mon Feb 23 08:29:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:28:44 2.mlp.up_proj
[Mon Feb 23 08:31:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:28:44 2.mlp.up_proj
[Mon Feb 23 08:33:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:31:12 2.mlp.down_proj
[Mon Feb 23 08:35:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:34:36 3.self_attn.o_proj
[Mon Feb 23 08:37:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:35:07 3.mlp.gate_proj
[Mon Feb 23 08:39:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:37:33 3.mlp.up_proj
[Mon Feb 23 08:41:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:40:00 3.mlp.down_proj
[Mon Feb 23 08:43:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:42:37 4.self_attn.q_proj
[Mon Feb 23 08:45:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:43:57 4.mlp.gate_proj
[Mon Feb 23 08:47:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:46:23 4.mlp.up_proj
[Mon Feb 23 08:49:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:48:50 4.mlp.down_proj
[Mon Feb 23 08:51:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:48:50 4.mlp.down_proj
[Mon Feb 23 08:53:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:52:45 5.mlp.gate_proj
[Mon Feb 23 08:55:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:52:45 5.mlp.gate_proj
[Mon Feb 23 08:57:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:55:11 5.mlp.up_proj
[Mon Feb 23 08:59:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 08:57:39 5.mlp.down_proj
[Mon Feb 23 09:01:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:01:03 6.self_attn.o_proj
[Mon Feb 23 09:03:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:01:42 6.mlp.gate_proj
[Mon Feb 23 09:05:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:04:08 6.mlp.up_proj
[Mon Feb 23 09:07:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:06:36 6.mlp.down_proj
[Mon Feb 23 09:09:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:06:36 6.mlp.down_proj
[Mon Feb 23 09:11:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:10:33 7.mlp.gate_proj
[Mon Feb 23 09:13:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:12:59 7.mlp.up_proj
[Mon Feb 23 09:15:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:12:59 7.mlp.up_proj
[Mon Feb 23 09:17:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:15:26 7.mlp.down_proj
[Mon Feb 23 09:19:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:18:51 8.self_attn.o_proj
[Mon Feb 23 09:21:04 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:19:28 8.mlp.gate_proj
[Mon Feb 23 09:23:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:21:54 8.mlp.up_proj
[Mon Feb 23 09:25:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:24:21 8.mlp.down_proj
[Mon Feb 23 09:27:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:26:59 9.self_attn.q_proj
[Mon Feb 23 09:29:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:28:38 9.mlp.gate_proj
[Mon Feb 23 09:31:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:31:03 9.mlp.up_proj
[Mon Feb 23 09:33:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:31:03 9.mlp.up_proj
[Mon Feb 23 09:35:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:33:31 9.mlp.down_proj
[Mon Feb 23 09:37:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:36:56 10.self_attn.o_proj
[Mon Feb 23 09:39:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:37:33 10.mlp.gate_proj
[Mon Feb 23 09:41:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:39:59 10.mlp.up_proj
[Mon Feb 23 09:43:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:42:27 10.mlp.down_proj
[Mon Feb 23 09:45:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:45:04 11.self_attn.q_proj
[Mon Feb 23 09:47:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:46:28 11.mlp.gate_proj
[Mon Feb 23 09:49:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:48:53 11.mlp.up_proj
[Mon Feb 23 09:51:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:48:53 11.mlp.up_proj
[Mon Feb 23 09:53:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:51:20 11.mlp.down_proj
[Mon Feb 23 09:55:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:54:45 12.self_attn.o_proj
[Mon Feb 23 09:57:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:55:21 12.mlp.gate_proj
[Mon Feb 23 09:59:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 09:57:47 12.mlp.up_proj
[Mon Feb 23 10:01:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:00:14 12.mlp.down_proj
[Mon Feb 23 10:03:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:02:51 13.self_attn.q_proj
[Mon Feb 23 10:05:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:04:17 13.mlp.gate_proj
[Mon Feb 23 10:07:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:06:43 13.mlp.up_proj
[Mon Feb 23 10:09:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:06:43 13.mlp.up_proj
[Mon Feb 23 10:11:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:09:10 13.mlp.down_proj
[Mon Feb 23 10:13:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:12:38 14.self_attn.o_proj
[Mon Feb 23 10:15:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:13:09 14.mlp.gate_proj
[Mon Feb 23 10:17:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:15:35 14.mlp.up_proj
[Mon Feb 23 10:19:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:18:03 14.mlp.down_proj
[Mon Feb 23 10:21:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:20:39 15.self_attn.q_proj
[Mon Feb 23 10:23:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:22:09 15.mlp.gate_proj
[Mon Feb 23 10:25:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:24:35 15.mlp.up_proj
[Mon Feb 23 10:27:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:27:02 15.mlp.down_proj
[Mon Feb 23 10:29:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:27:02 15.mlp.down_proj
[Mon Feb 23 10:31:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:31:01 16.mlp.gate_proj
[Mon Feb 23 10:33:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:31:01 16.mlp.gate_proj
[Mon Feb 23 10:35:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:33:27 16.mlp.up_proj
[Mon Feb 23 10:37:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:35:55 16.mlp.down_proj
[Mon Feb 23 10:39:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:38:31 17.self_attn.q_proj
[Mon Feb 23 10:41:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:39:59 17.mlp.gate_proj
[Mon Feb 23 10:43:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:42:25 17.mlp.up_proj
[Mon Feb 23 10:45:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:44:52 17.mlp.down_proj
[Mon Feb 23 10:47:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:44:52 17.mlp.down_proj
[Mon Feb 23 10:49:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:48:54 18.mlp.gate_proj
[Mon Feb 23 10:51:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:48:54 18.mlp.gate_proj
[Mon Feb 23 10:53:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:51:20 18.mlp.up_proj
[Mon Feb 23 10:55:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:53:48 18.mlp.down_proj
[Mon Feb 23 10:57:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:57:04 19.self_attn.k_proj
[Mon Feb 23 10:59:05 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 10:57:51 19.mlp.gate_proj
[Mon Feb 23 11:01:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:00:17 19.mlp.up_proj
[Mon Feb 23 11:03:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:02:45 19.mlp.down_proj
[Mon Feb 23 11:05:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:02:45 19.mlp.down_proj
[Mon Feb 23 11:07:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:06:47 20.mlp.gate_proj
[Mon Feb 23 11:09:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:06:47 20.mlp.gate_proj
[Mon Feb 23 11:11:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:09:14 20.mlp.up_proj
[Mon Feb 23 11:13:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:11:41 20.mlp.down_proj
[Mon Feb 23 11:15:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:15:03 21.self_attn.o_proj
[Mon Feb 23 11:17:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:15:30 21.mlp.gate_proj
[Mon Feb 23 11:19:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:17:55 21.mlp.up_proj
[Mon Feb 23 11:21:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:20:23 21.mlp.down_proj
[Mon Feb 23 11:23:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:23:00 22.self_attn.q_proj
[Mon Feb 23 11:25:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:24:17 22.mlp.gate_proj
[Mon Feb 23 11:27:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:26:43 22.mlp.up_proj
[Mon Feb 23 11:29:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:26:43 22.mlp.up_proj
[Mon Feb 23 11:31:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:29:11 22.mlp.down_proj
[Mon Feb 23 11:33:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:32:38 23.self_attn.o_proj
[Mon Feb 23 11:35:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:33:12 23.mlp.gate_proj
[Mon Feb 23 11:37:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:35:38 23.mlp.up_proj
[Mon Feb 23 11:39:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:38:05 23.mlp.down_proj
[Mon Feb 23 11:41:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:40:43 24.self_attn.q_proj
[Mon Feb 23 11:43:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:42:05 24.mlp.gate_proj
[Mon Feb 23 11:45:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:44:31 24.mlp.up_proj
[Mon Feb 23 11:47:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:46:58 24.mlp.down_proj
+
[Mon Feb 23 11:49:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:46:58 24.mlp.down_proj
|
| 114 |
+
[Mon Feb 23 11:51:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:50:29 25.self_attn.o_proj
|
| 115 |
+
[Mon Feb 23 11:53:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:51:08 25.mlp.gate_proj
|
| 116 |
+
[Mon Feb 23 11:55:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:53:34 25.mlp.up_proj
|
| 117 |
+
[Mon Feb 23 11:57:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:56:01 25.mlp.down_proj
|
| 118 |
+
[Mon Feb 23 11:59:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 11:58:38 26.self_attn.q_proj
|
| 119 |
+
[Mon Feb 23 12:01:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:00:01 26.mlp.gate_proj
|
| 120 |
+
[Mon Feb 23 12:03:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:02:27 26.mlp.up_proj
|
| 121 |
+
[Mon Feb 23 12:05:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:04:54 26.mlp.down_proj
|
| 122 |
+
[Mon Feb 23 12:07:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:04:54 26.mlp.down_proj
|
| 123 |
+
[Mon Feb 23 12:09:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:08:52 27.mlp.gate_proj
|
| 124 |
+
[Mon Feb 23 12:11:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:08:52 27.mlp.gate_proj
|
| 125 |
+
[Mon Feb 23 12:13:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:11:17 27.mlp.up_proj
|
| 126 |
+
[Mon Feb 23 12:15:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:13:45 27.mlp.down_proj
|
| 127 |
+
[Mon Feb 23 12:17:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:17:05 28.self_attn.v_proj
|
| 128 |
+
[Mon Feb 23 12:19:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:17:47 28.mlp.gate_proj
|
| 129 |
+
[Mon Feb 23 12:21:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:20:13 28.mlp.up_proj
|
| 130 |
+
[Mon Feb 23 12:23:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:22:41 28.mlp.down_proj
|
| 131 |
+
[Mon Feb 23 12:25:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:22:41 28.mlp.down_proj
|
| 132 |
+
[Mon Feb 23 12:27:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:26:45 29.mlp.gate_proj
|
| 133 |
+
[Mon Feb 23 12:29:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:26:45 29.mlp.gate_proj
|
| 134 |
+
[Mon Feb 23 12:31:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:29:11 29.mlp.up_proj
|
| 135 |
+
[Mon Feb 23 12:33:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:31:38 29.mlp.down_proj
|
| 136 |
+
[Mon Feb 23 12:35:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:35:04 30.self_attn.o_proj
|
| 137 |
+
[Mon Feb 23 12:37:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:35:35 30.mlp.gate_proj
|
| 138 |
+
[Mon Feb 23 12:39:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:38:01 30.mlp.up_proj
|
| 139 |
+
[Mon Feb 23 12:41:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:40:28 30.mlp.down_proj
|
| 140 |
+
[Mon Feb 23 12:43:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:43:06 31.self_attn.q_proj
|
| 141 |
+
[Mon Feb 23 12:45:06 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:44:41 31.mlp.gate_proj
|
| 142 |
+
[Mon Feb 23 12:47:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:47:06 31.mlp.up_proj
|
| 143 |
+
[Mon Feb 23 12:49:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:47:06 31.mlp.up_proj
|
| 144 |
+
[Mon Feb 23 12:51:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:49:34 31.mlp.down_proj
|
| 145 |
+
[Mon Feb 23 12:53:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:53:03 32.self_attn.o_proj
|
| 146 |
+
[Mon Feb 23 12:55:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:53:41 32.mlp.gate_proj
|
| 147 |
+
[Mon Feb 23 12:57:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:56:07 32.mlp.up_proj
|
| 148 |
+
[Mon Feb 23 12:59:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:58:34 32.mlp.down_proj
|
| 149 |
+
[Mon Feb 23 13:01:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 12:58:34 32.mlp.down_proj
|
| 150 |
+
[Mon Feb 23 13:03:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:02:41 33.mlp.gate_proj
|
| 151 |
+
[Mon Feb 23 13:05:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:05:06 33.mlp.up_proj
|
| 152 |
+
[Mon Feb 23 13:07:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:05:06 33.mlp.up_proj
|
| 153 |
+
[Mon Feb 23 13:09:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:07:34 33.mlp.down_proj
|
| 154 |
+
[Mon Feb 23 13:11:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:11:04 34.self_attn.o_proj
|
| 155 |
+
[Mon Feb 23 13:13:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:11:37 34.mlp.gate_proj
|
| 156 |
+
[Mon Feb 23 13:15:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:14:02 34.mlp.up_proj
|
| 157 |
+
[Mon Feb 23 13:17:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:16:30 34.mlp.down_proj
|
| 158 |
+
[Mon Feb 23 13:19:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:19:06 35.self_attn.q_proj
|
| 159 |
+
[Mon Feb 23 13:21:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:20:29 35.mlp.gate_proj
|
| 160 |
+
[Mon Feb 23 13:23:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:22:56 35.mlp.up_proj
|
| 161 |
+
[Mon Feb 23 13:25:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:22:56 35.mlp.up_proj
|
| 162 |
+
[Mon Feb 23 13:27:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:25:24 35.mlp.down_proj
|
| 163 |
+
[Mon Feb 23 13:29:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:28:51 36.self_attn.o_proj
|
| 164 |
+
[Mon Feb 23 13:31:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:29:22 36.mlp.gate_proj
|
| 165 |
+
[Mon Feb 23 13:33:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:31:47 36.mlp.up_proj
|
| 166 |
+
[Mon Feb 23 13:35:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:34:15 36.mlp.down_proj
|
| 167 |
+
[Mon Feb 23 13:37:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:36:52 37.self_attn.q_proj
|
| 168 |
+
[Mon Feb 23 13:39:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:38:23 37.mlp.gate_proj
|
| 169 |
+
[Mon Feb 23 13:41:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:40:49 37.mlp.up_proj
|
| 170 |
+
[Mon Feb 23 13:43:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:40:49 37.mlp.up_proj
|
| 171 |
+
[Mon Feb 23 13:45:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:43:17 37.mlp.down_proj
|
| 172 |
+
[Mon Feb 23 13:47:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:46:42 38.self_attn.o_proj
|
| 173 |
+
[Mon Feb 23 13:49:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:47:13 38.mlp.gate_proj
|
| 174 |
+
[Mon Feb 23 13:51:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:49:39 38.mlp.up_proj
|
| 175 |
+
[Mon Feb 23 13:53:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:52:07 38.mlp.down_proj
|
| 176 |
+
[Mon Feb 23 13:55:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:54:45 39.self_attn.q_proj
|
| 177 |
+
[Mon Feb 23 13:57:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:56:11 39.mlp.gate_proj
|
| 178 |
+
[Mon Feb 23 13:59:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 13:58:37 39.mlp.up_proj
|
| 179 |
+
[Mon Feb 23 14:01:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:01:04 39.mlp.down_proj
|
| 180 |
+
[Mon Feb 23 14:03:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:01:04 39.mlp.down_proj
|
| 181 |
+
[Mon Feb 23 14:05:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:04:32 40.self_attn.o_proj
|
| 182 |
+
[Mon Feb 23 14:07:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:05:10 40.mlp.gate_proj
|
| 183 |
+
[Mon Feb 23 14:09:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:07:35 40.mlp.up_proj
|
| 184 |
+
[Mon Feb 23 14:11:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:10:03 40.mlp.down_proj
|
| 185 |
+
[Mon Feb 23 14:13:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:12:39 41.self_attn.q_proj
|
| 186 |
+
[Mon Feb 23 14:15:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:14:09 41.mlp.gate_proj
|
| 187 |
+
[Mon Feb 23 14:17:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:16:35 41.mlp.up_proj
|
| 188 |
+
[Mon Feb 23 14:19:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:19:02 41.mlp.down_proj
|
| 189 |
+
[Mon Feb 23 14:21:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:19:02 41.mlp.down_proj
|
| 190 |
+
[Mon Feb 23 14:23:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:22:58 42.mlp.gate_proj
|
| 191 |
+
[Mon Feb 23 14:25:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:22:58 42.mlp.gate_proj
|
| 192 |
+
[Mon Feb 23 14:27:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:25:24 42.mlp.up_proj
|
| 193 |
+
[Mon Feb 23 14:29:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:27:51 42.mlp.down_proj
|
| 194 |
+
[Mon Feb 23 14:31:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:31:05 43.self_attn.k_proj
|
| 195 |
+
[Mon Feb 23 14:33:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:31:59 43.mlp.gate_proj
|
| 196 |
+
[Mon Feb 23 14:35:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:34:25 43.mlp.up_proj
|
| 197 |
+
[Mon Feb 23 14:37:07 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:36:52 43.mlp.down_proj
|
| 198 |
+
[Mon Feb 23 14:39:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:36:52 43.mlp.down_proj
|
| 199 |
+
[Mon Feb 23 14:41:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:40:54 44.mlp.gate_proj
|
| 200 |
+
[Mon Feb 23 14:43:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:40:54 44.mlp.gate_proj
|
| 201 |
+
[Mon Feb 23 14:45:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:43:20 44.mlp.up_proj
|
| 202 |
+
[Mon Feb 23 14:47:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:45:47 44.mlp.down_proj
|
| 203 |
+
[Mon Feb 23 14:49:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:49:07 45.self_attn.v_proj
|
| 204 |
+
[Mon Feb 23 14:51:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:49:50 45.mlp.gate_proj
|
| 205 |
+
[Mon Feb 23 14:53:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:52:16 45.mlp.up_proj
|
| 206 |
+
[Mon Feb 23 14:55:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:54:44 45.mlp.down_proj
|
| 207 |
+
[Mon Feb 23 14:57:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:54:44 45.mlp.down_proj
|
| 208 |
+
[Mon Feb 23 14:59:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:58:45 46.mlp.gate_proj
|
| 209 |
+
[Mon Feb 23 15:01:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 14:58:45 46.mlp.gate_proj
|
| 210 |
+
[Mon Feb 23 15:03:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:01:11 46.mlp.up_proj
|
| 211 |
+
[Mon Feb 23 15:05:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:03:38 46.mlp.down_proj
|
| 212 |
+
[Mon Feb 23 15:07:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:07:05 47.self_attn.o_proj
|
| 213 |
+
[Mon Feb 23 15:09:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:07:39 47.mlp.gate_proj
|
| 214 |
+
[Mon Feb 23 15:11:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:10:05 47.mlp.up_proj
|
| 215 |
+
[Mon Feb 23 15:13:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:12:32 47.mlp.down_proj
|
| 216 |
+
[Mon Feb 23 15:15:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:12:32 47.mlp.down_proj
|
| 217 |
+
[Mon Feb 23 15:17:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:16:47 48.mlp.gate_proj
|
| 218 |
+
[Mon Feb 23 15:19:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:16:47 48.mlp.gate_proj
|
| 219 |
+
[Mon Feb 23 15:21:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:19:14 48.mlp.up_proj
|
| 220 |
+
[Mon Feb 23 15:23:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:21:42 48.mlp.down_proj
|
| 221 |
+
[Mon Feb 23 15:25:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:25:02 49.self_attn.v_proj
|
| 222 |
+
[Mon Feb 23 15:27:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:25:49 49.mlp.gate_proj
|
| 223 |
+
[Mon Feb 23 15:29:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:28:14 49.mlp.up_proj
|
| 224 |
+
[Mon Feb 23 15:31:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:30:42 49.mlp.down_proj
|
| 225 |
+
[Mon Feb 23 15:33:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:30:42 49.mlp.down_proj
|
| 226 |
+
[Mon Feb 23 15:35:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:30:42 49.mlp.down_proj
|
| 227 |
+
[Mon Feb 23 15:37:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:30:42 49.mlp.down_proj
|
| 228 |
+
[Mon Feb 23 15:39:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:30:42 49.mlp.down_proj
|
| 229 |
+
[Mon Feb 23 15:41:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:30:42 49.mlp.down_proj
|
| 230 |
+
[Mon Feb 23 15:43:08 UTC 2026] Still running... INFO - ----Quantizing llama ...---- 2026-02-23 15:30:42 49.mlp.down_proj
|
| 231 |
+
[Mon Feb 23 15:45:08 UTC 2026] tmux session 'vptq' has ended. Quantization finished!
|
| 232 |
+
[Mon Feb 23 15:45:18 UTC 2026] Starting post-quantization automation...
|
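The `monitor.log` entries above follow a simple poll-and-report pattern: every two minutes a watcher checks whether the `vptq` tmux session is still alive and echoes the latest "Quantizing" line, then announces completion when the session exits. A minimal sketch of such a watcher (this is a reconstruction under assumptions, not the actual script; the log path `quantization.log` and the 120 s interval are guesses inferred from the timestamps) could look like:

```shell
#!/usr/bin/env bash
# Hypothetical reconstruction of the loop behind monitor.log.
# Assumes: a tmux session named "vptq" runs the quantizer, and its
# output lands in quantization.log in the current directory.
SESSION="vptq"
LOG="quantization.log"

monitor() {
    # Loop while the tmux session is still alive.
    while tmux has-session -t "$SESSION" 2>/dev/null; do
        # Grab the most recent "Quantizing" line from the quantizer log.
        last=$(grep -F -- '----Quantizing llama' "$LOG" 2>/dev/null | tail -n 1)
        echo "[$(date -u '+%a %b %d %H:%M:%S UTC %Y')] Still running... $last"
        sleep 120
    done
    echo "[$(date -u '+%a %b %d %H:%M:%S UTC %Y')] tmux session '$SESSION' has ended. Quantization finished!"
}

monitor
```

The duplicated layer names in the log (the same `49.mlp.down_proj` repeated for several polls at the end) are consistent with this design: the poller re-reports the last log line until the session actually terminates.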
variant-e/post_quant.log
ADDED
@@ -0,0 +1,67 @@
+[Mon Feb 23 15:45:18 UTC 2026] INFO: === POST-QUANTIZATION AUTOMATION STARTED ===
+[Mon Feb 23 15:45:18 UTC 2026] INFO: Output directory: /workspace/variant-e/output/2026-02-23-08-06-48/
+[Mon Feb 23 15:45:18 UTC 2026] INFO: Packed model path: /workspace/variant-e/output/2026-02-23-08-06-48/packed_model
+[Mon Feb 23 15:45:18 UTC 2026] INFO: Unpacked model path: /workspace/variant-e/output/2026-02-23-08-06-48/model
+[Mon Feb 23 15:45:18 UTC 2026] INFO: STEP 1: Verifying quantized model output...
+[Mon Feb 23 15:45:18 UTC 2026] INFO: Output dir contents:
+-rw-rw-r-- 1 root root 611K Feb 23 15:38 /workspace/variant-e/output/2026-02-23-08-06-48/logs/0.log
+-rw-rw-r-- 1 root root 1.9K Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/model/config.json
+-rw-rw-r-- 1 root root 153 Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/model/generation_config.json
+-rw-rw-r-- 1 root root 4.7G Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/model/model-00001-of-00002.safetensors
+-rw-rw-r-- 1 root root 1.3G Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/model/model-00002-of-00002.safetensors
+-rw-rw-r-- 1 root root 217K Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/model/model.safetensors.index.json
+-rw-rw-r-- 1 root root 27K Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/model/tokenizer_config.json
+-rw-rw-r-- 1 root root 3.4K Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/model/special_tokens_map.json
+-rw-rw-r-- 1 root root 3.5K Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/model/added_tokens.json
+-rw-rw-r-- 1 root root 482K Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/model/tokenizer.model
+-rw-rw-r-- 1 root root 3.4M Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/model/tokenizer.json
+-rw-rw-r-- 1 root root 206K Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/packed_model/config.json
+-rw-rw-r-- 1 root root 153 Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/packed_model/generation_config.json
+-rw-rw-r-- 1 root root 4.7G Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/packed_model/model.safetensors
+-rw-rw-r-- 1 root root 27K Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/packed_model/tokenizer_config.json
+-rw-rw-r-- 1 root root 3.4K Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/packed_model/special_tokens_map.json
+-rw-rw-r-- 1 root root 3.5K Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/packed_model/added_tokens.json
+-rw-rw-r-- 1 root root 482K Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/packed_model/tokenizer.model
+-rw-rw-r-- 1 root root 3.4M Feb 23 15:35 /workspace/variant-e/output/2026-02-23-08-06-48/packed_model/tokenizer.json
+-rw-rw-r-- 1 root root 91 Feb 23 15:43 /workspace/variant-e/output/2026-02-23-08-06-48/ppl_results.json
+[Mon Feb 23 15:45:18 UTC 2026] INFO: Using PACKED model from: /workspace/variant-e/output/2026-02-23-08-06-48/packed_model
+[Mon Feb 23 15:45:18 UTC 2026] INFO: Model files found:
+total 4.7G
+drwxrwxr-x 2 root root 4.0K Feb 23 15:35 .
+drwxrwxr-x 5 root root 95 Feb 23 15:43 ..
+-rw-rw-r-- 1 root root 3.5K Feb 23 15:35 added_tokens.json
+-rw-rw-r-- 1 root root 206K Feb 23 15:35 config.json
+-rw-rw-r-- 1 root root 153 Feb 23 15:35 generation_config.json
+-rw-rw-r-- 1 root root 4.7G Feb 23 15:35 model.safetensors
+-rw-rw-r-- 1 root root 3.4K Feb 23 15:35 special_tokens_map.json
+-rw-rw-r-- 1 root root 3.4M Feb 23 15:35 tokenizer.json
+-rw-rw-r-- 1 root root 482K Feb 23 15:35 tokenizer.model
+-rw-rw-r-- 1 root root 27K Feb 23 15:35 tokenizer_config.json
+[Mon Feb 23 15:45:18 UTC 2026] INFO: Model source size: 4.7G
+[Mon Feb 23 15:45:18 UTC 2026] INFO: STEP 1 DONE: Model at /workspace/model-vptq — 8 files, 4.7G total, 1 safetensors
+total 4.7G
+drwxrwxr-x 2 root root 4.0K Feb 23 15:45 .
+drwxrwxr-x 11 root root 4.0K Feb 23 15:45 ..
+-rw-rw-r-- 1 root root 3.5K Feb 23 15:45 added_tokens.json
+-rw-rw-r-- 1 root root 206K Feb 23 15:45 config.json
+-rw-rw-r-- 1 root root 153 Feb 23 15:45 generation_config.json
+-rw-rw-r-- 1 root root 4.7G Feb 23 15:45 model.safetensors
+-rw-rw-r-- 1 root root 3.4K Feb 23 15:45 special_tokens_map.json
+-rw-rw-r-- 1 root root 3.4M Feb 23 15:45 tokenizer.json
+-rw-rw-r-- 1 root root 482K Feb 23 15:45 tokenizer.model
+-rw-rw-r-- 1 root root 27K Feb 23 15:45 tokenizer_config.json
+[Mon Feb 23 15:45:18 UTC 2026] INFO: STEP 2: Uploading model to HuggingFace (PRIORITY)...
+[Mon Feb 23 15:46:01 UTC 2026] INFO: STEP 2 DONE: Model upload completed
+[Mon Feb 23 15:46:01 UTC 2026] INFO: STEP 3: Uploading logs to HuggingFace...
+[Mon Feb 23 15:46:03 UTC 2026] INFO: STEP 3 DONE
+[Mon Feb 23 15:46:03 UTC 2026] INFO: STEP 4: Testing generation...
+[Mon Feb 23 15:46:22 UTC 2026] INFO: STEP 4 DONE
+[Mon Feb 23 15:46:22 UTC 2026] INFO: STEP 5: Running eval 5-shot MC (10 tasks)...
+[Mon Feb 23 15:46:32 UTC 2026] INFO: STEP 5 DONE
+[Mon Feb 23 15:46:32 UTC 2026] INFO: STEP 6: Running eval 5-shot GEN (12 tasks)...
+[Mon Feb 23 15:46:42 UTC 2026] INFO: STEP 6 DONE
+[Mon Feb 23 15:46:42 UTC 2026] INFO: STEP 7: Running eval 0-shot MC (10 tasks)...
+[Mon Feb 23 15:46:53 UTC 2026] INFO: STEP 7 DONE
+[Mon Feb 23 15:46:53 UTC 2026] INFO: STEP 8: Running eval 0-shot GEN (12 tasks)...
+[Mon Feb 23 15:47:03 UTC 2026] INFO: STEP 8 DONE
+[Mon Feb 23 15:47:03 UTC 2026] INFO: STEP 9: Uploading ALL results to HuggingFace...
variant-e/post_quant_full.log
ADDED
@@ -0,0 +1,1190 @@
+...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB
+...el-vptq/model.safetensors:   0%|          | 526kB / 5.00GB
+[~100 repeated upload progress frames elided: model.safetensors climbing from 0% to 29% while tokenizer.model holds at 100%; terminal cursor-control sequences stripped]
+...el-vptq/model.safetensors:  29%|██▉       | 1.45GB / 5.00GB
|
|
|
|
|
|
|
|
|
|
| 104 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 105 |
...el-vptq/model.safetensors: 29%|██▉ | 1.47GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 106 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 107 |
...el-vptq/model.safetensors: 30%|██▉ | 1.50GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 108 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 109 |
...el-vptq/model.safetensors: 30%|███ | 1.52GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 110 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 111 |
...el-vptq/model.safetensors: 31%|███ | 1.55GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 112 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 113 |
...el-vptq/model.safetensors: 32%|███▏ | 1.59GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 114 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 115 |
...el-vptq/model.safetensors: 33%|███▎ | 1.63GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 116 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 117 |
...el-vptq/model.safetensors: 33%|███▎ | 1.67GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 118 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 119 |
...el-vptq/model.safetensors: 34%|███▍ | 1.72GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 120 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 121 |
...el-vptq/model.safetensors: 35%|███▌ | 1.76GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 122 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 123 |
...el-vptq/model.safetensors: 36%|███▌ | 1.79GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 124 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 125 |
...el-vptq/model.safetensors: 37%|███▋ | 1.83GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 126 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 127 |
...el-vptq/model.safetensors: 37%|███▋ | 1.85GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 128 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 129 |
...el-vptq/model.safetensors: 37%|███▋ | 1.87GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 130 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 131 |
...el-vptq/model.safetensors: 38%|███▊ | 1.89GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 132 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 133 |
...el-vptq/model.safetensors: 38%|███▊ | 1.92GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 134 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 135 |
...el-vptq/model.safetensors: 39%|███▉ | 1.96GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 136 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 137 |
...el-vptq/model.safetensors: 40%|███▉ | 1.99GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 138 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 139 |
...el-vptq/model.safetensors: 41%|████ | 2.03GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 140 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 141 |
...el-vptq/model.safetensors: 41%|████▏ | 2.07GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 142 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 143 |
...el-vptq/model.safetensors: 42%|████▏ | 2.11GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 144 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 145 |
...el-vptq/model.safetensors: 43%|████▎ | 2.16GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 146 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 147 |
...el-vptq/model.safetensors: 44%|████▍ | 2.19GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 148 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 149 |
...el-vptq/model.safetensors: 44%|████▍ | 2.22GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 150 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 151 |
...el-vptq/model.safetensors: 45%|████▌ | 2.26GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 152 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 153 |
...el-vptq/model.safetensors: 46%|████▌ | 2.29GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 154 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 155 |
...el-vptq/model.safetensors: 47%|████▋ | 2.33GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 156 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 157 |
...el-vptq/model.safetensors: 47%|████▋ | 2.36GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 158 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 159 |
...el-vptq/model.safetensors: 48%|████▊ | 2.39GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 160 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 161 |
...el-vptq/model.safetensors: 49%|████▊ | 2.43GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 162 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 163 |
...el-vptq/model.safetensors: 49%|████▉ | 2.46GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 164 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 165 |
...el-vptq/model.safetensors: 50%|████▉ | 2.48GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 166 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 167 |
...el-vptq/model.safetensors: 50%|█████ | 2.50GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 168 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 169 |
...el-vptq/model.safetensors: 51%|█████ | 2.53GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 170 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 171 |
...el-vptq/model.safetensors: 52%|█████▏ | 2.57GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 172 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 173 |
...el-vptq/model.safetensors: 52%|█████▏ | 2.62GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 174 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 175 |
...el-vptq/model.safetensors: 53%|█████▎ | 2.66GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 176 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 177 |
...el-vptq/model.safetensors: 54%|█████▍ | 2.71GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 178 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 179 |
...el-vptq/model.safetensors: 55%|█████▌ | 2.75GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 180 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 181 |
...el-vptq/model.safetensors: 56%|█████▌ | 2.79GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 182 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 183 |
...el-vptq/model.safetensors: 57%|█████▋ | 2.83GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 184 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 185 |
...el-vptq/model.safetensors: 57%|█████▋ | 2.85GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 186 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 187 |
...el-vptq/model.safetensors: 57%|█████▋ | 2.87GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 188 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 189 |
...el-vptq/model.safetensors: 58%|█████▊ | 2.90GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 190 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 191 |
...el-vptq/model.safetensors: 59%|█████▊ | 2.92GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 192 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 193 |
...el-vptq/model.safetensors: 59%|█████▉ | 2.95GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 194 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 195 |
...el-vptq/model.safetensors: 60%|█████▉ | 2.98GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 196 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 197 |
...el-vptq/model.safetensors: 60%|██████ | 3.01GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 198 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 199 |
...el-vptq/model.safetensors: 61%|██████ | 3.05GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 200 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 201 |
...el-vptq/model.safetensors: 62%|██████▏ | 3.09GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 202 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 203 |
...el-vptq/model.safetensors: 63%|██████▎ | 3.13GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 204 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 205 |
...el-vptq/model.safetensors: 63%|██████▎ | 3.17GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 206 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 207 |
...el-vptq/model.safetensors: 64%|██████▍ | 3.21GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 208 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 209 |
...el-vptq/model.safetensors: 65%|██████▌ | 3.25GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 210 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 211 |
...el-vptq/model.safetensors: 66%|██████▌ | 3.30GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 212 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 213 |
...el-vptq/model.safetensors: 67%|██████▋ | 3.32GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 214 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 215 |
...el-vptq/model.safetensors: 67%|██████▋ | 3.35GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 216 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 217 |
...el-vptq/model.safetensors: 67%|██████▋ | 3.36GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 218 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 219 |
...el-vptq/model.safetensors: 68%|██████▊ | 3.37GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 220 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 221 |
...el-vptq/model.safetensors: 68%|██████▊ | 3.39GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 222 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 223 |
...el-vptq/model.safetensors: 68%|██████▊ | 3.42GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 224 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 225 |
...el-vptq/model.safetensors: 69%|██████▉ | 3.45GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 226 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 227 |
...el-vptq/model.safetensors: 70%|██████▉ | 3.48GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 228 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 229 |
...el-vptq/model.safetensors: 70%|███████ | 3.52GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 230 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 231 |
...el-vptq/model.safetensors: 71%|███████ | 3.56GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 232 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 233 |
...el-vptq/model.safetensors: 72%|███████▏ | 3.60GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 234 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 235 |
...el-vptq/model.safetensors: 73%|███████▎ | 3.65GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 236 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 237 |
...el-vptq/model.safetensors: 74%|███████▎ | 3.68GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 238 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 239 |
...el-vptq/model.safetensors: 74%|███████▍ | 3.71GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 240 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 241 |
...el-vptq/model.safetensors: 75%|███████▌ | 3.75GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 242 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 243 |
...el-vptq/model.safetensors: 76%|███████▌ | 3.78GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 244 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 245 |
...el-vptq/model.safetensors: 76%|███████▋ | 3.82GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 246 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 247 |
...el-vptq/model.safetensors: 77%|███████▋ | 3.85GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 248 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 249 |
...el-vptq/model.safetensors: 78%|███████▊ | 3.87GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 250 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 251 |
...el-vptq/model.safetensors: 78%|███████▊ | 3.90GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 252 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 253 |
...el-vptq/model.safetensors: 78%|███████▊ | 3.92GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 254 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 255 |
...el-vptq/model.safetensors: 79%|███████▉ | 3.95GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 256 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 257 |
...el-vptq/model.safetensors: 79%|███████▉ | 3.96GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 258 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 259 |
...el-vptq/model.safetensors: 80%|███████▉ | 3.99GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 260 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 261 |
...el-vptq/model.safetensors: 80%|████████ | 4.01GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 262 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 263 |
...el-vptq/model.safetensors: 81%|████████ | 4.04GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 264 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 265 |
...el-vptq/model.safetensors: 81%|████████▏ | 4.06GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 266 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 267 |
...el-vptq/model.safetensors: 82%|████████▏ | 4.10GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 268 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 269 |
...el-vptq/model.safetensors: 83%|████████▎ | 4.13GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 270 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 271 |
...el-vptq/model.safetensors: 83%|████████▎ | 4.16GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 272 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 273 |
...el-vptq/model.safetensors: 84%|████████▎ | 4.18GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 274 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 275 |
...el-vptq/model.safetensors: 85%|████████▍ | 4.22GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 276 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 277 |
...el-vptq/model.safetensors: 85%|████████▌ | 4.25GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 278 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 279 |
...el-vptq/model.safetensors: 86%|████████▌ | 4.28GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 280 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 281 |
...el-vptq/model.safetensors: 86%|████████▌ | 4.31GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 282 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 283 |
...el-vptq/model.safetensors: 87%|████████▋ | 4.34GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 284 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 285 |
...el-vptq/model.safetensors: 87%|████████▋ | 4.36GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 286 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 287 |
...el-vptq/model.safetensors: 88%|████████▊ | 4.40GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 288 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 289 |
...el-vptq/model.safetensors: 89%|████████▊ | 4.43GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 290 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 291 |
...el-vptq/model.safetensors: 89%|████████▉ | 4.45GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 292 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 293 |
...el-vptq/model.safetensors: 90%|████████▉ | 4.48GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 294 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 295 |
...el-vptq/model.safetensors: 90%|█████████ | 4.51GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 296 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 297 |
...el-vptq/model.safetensors: 91%|█████████ | 4.55GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 298 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 299 |
...el-vptq/model.safetensors: 92%|█████████▏| 4.59GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 300 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 301 |
...el-vptq/model.safetensors: 93%|█████████▎| 4.62GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 302 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 303 |
...el-vptq/model.safetensors: 93%|█████████▎| 4.65GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 304 |
...odel-vptq/tokenizer.model: 100%|████████���█| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 305 |
...el-vptq/model.safetensors: 94%|█████████▍| 4.69GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 306 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 307 |
...el-vptq/model.safetensors: 95%|█████████▍| 4.72GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 308 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 309 |
...el-vptq/model.safetensors: 95%|█████████▌| 4.76GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 310 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 311 |
...el-vptq/model.safetensors: 96%|█████████▌| 4.79GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 312 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 313 |
...el-vptq/model.safetensors: 96%|█████████▋| 4.82GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 314 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 315 |
...el-vptq/model.safetensors: 97%|█████████▋| 4.85GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 316 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 317 |
...el-vptq/model.safetensors: 98%|█████████▊| 4.90GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 318 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 319 |
...el-vptq/model.safetensors: 98%|█████████▊| 4.92GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 320 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 321 |
...el-vptq/model.safetensors: 99%|█████████▉| 4.94GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 322 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 323 |
...el-vptq/model.safetensors: 99%|█████████▉| 4.96GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 324 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 325 |
...el-vptq/model.safetensors: 100%|█████████▉| 4.97GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 326 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 327 |
...el-vptq/model.safetensors: 100%|█████████▉| 4.98GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 328 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 329 |
...el-vptq/model.safetensors: 100%|██████��██▉| 4.98GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 330 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 331 |
...el-vptq/model.safetensors: 100%|█████████▉| 4.99GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 332 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 333 |
...el-vptq/model.safetensors: 100%|█████████▉| 4.99GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 334 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 335 |
...el-vptq/model.safetensors: 100%|█████████▉| 5.00GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 336 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 337 |
...el-vptq/model.safetensors: 100%|█████████▉| 5.00GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
| 338 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 339 |
...el-vptq/model.safetensors: 100%|█████████▉| 5.00GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
| 340 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 341 |
...el-vptq/model.safetensors: 100%|█████████▉| 5.00GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 342 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 343 |
...el-vptq/model.safetensors: 100%|█████████▉| 5.00GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
|
|
|
| 344 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 345 |
...el-vptq/model.safetensors: 100%|█████████▉| 5.00GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
| 346 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 347 |
...el-vptq/model.safetensors: 100%|█████████▉| 5.00GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
| 348 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 349 |
...el-vptq/model.safetensors: 100%|█████████▉| 5.00GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
| 350 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 351 |
...el-vptq/model.safetensors: 100%|█████████▉| 5.00GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
| 352 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 353 |
...el-vptq/model.safetensors: 100%|█████████▉| 5.00GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
| 354 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
|
|
|
|
|
|
|
|
|
| 355 |
...el-vptq/model.safetensors: 100%|█████████▉| 5.00GB / 5.00GB [A[A[A
|
|
|
|
|
|
|
| 356 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB
|
|
|
|
| 357 |
...el-vptq/model.safetensors: 100%|█████████▉| 5.00GB / 5.00GB
[Mon Feb 23 15:45:18 UTC 2026] INFO: === POST-QUANTIZATION AUTOMATION STARTED ===
[Mon Feb 23 15:45:18 UTC 2026] INFO: Output directory: /workspace/variant-e/output/2026-02-23-08-06-48/
[Mon Feb 23 15:45:18 UTC 2026] INFO: Packed model path: /workspace/variant-e/output/2026-02-23-08-06-48/packed_model
[Mon Feb 23 15:45:18 UTC 2026] INFO: Unpacked model path: /workspace/variant-e/output/2026-02-23-08-06-48/model
[Mon Feb 23 15:45:18 UTC 2026] INFO: STEP 1: Verifying quantized model output...
[Mon Feb 23 15:45:18 UTC 2026] INFO: Output dir contents:
[Mon Feb 23 15:45:18 UTC 2026] INFO: Using PACKED model from: /workspace/variant-e/output/2026-02-23-08-06-48/packed_model
[Mon Feb 23 15:45:18 UTC 2026] INFO: Model files found:
total 4.7G
drwxrwxr-x 2 root root 4.0K Feb 23 15:35 .
drwxrwxr-x 5 root root   95 Feb 23 15:43 ..
-rw-rw-r-- 1 root root 3.5K Feb 23 15:35 added_tokens.json
-rw-rw-r-- 1 root root 206K Feb 23 15:35 config.json
-rw-rw-r-- 1 root root  153 Feb 23 15:35 generation_config.json
-rw-rw-r-- 1 root root 4.7G Feb 23 15:35 model.safetensors
-rw-rw-r-- 1 root root 3.4K Feb 23 15:35 special_tokens_map.json
-rw-rw-r-- 1 root root 3.4M Feb 23 15:35 tokenizer.json
-rw-rw-r-- 1 root root 482K Feb 23 15:35 tokenizer.model
-rw-rw-r-- 1 root root  27K Feb 23 15:35 tokenizer_config.json
[Mon Feb 23 15:45:18 UTC 2026] INFO: Model source size: 4.7G
[Mon Feb 23 15:45:18 UTC 2026] INFO: STEP 1 DONE: Model at /workspace/model-vptq — 8 files, 4.7G total, 1 safetensors
[Mon Feb 23 15:45:18 UTC 2026] INFO: STEP 2: Uploading model to HuggingFace (PRIORITY)...
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB
...el-vptq/model.safetensors:   0%|          | 526kB / 5.00GB
[progress frames elided: model.safetensors upload 0% → 10% (514MB / 5.00GB) at end of chunk; tokenizer.model upload complete (493kB / 493kB) throughout]
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 231 |
+
|
| 232 |
+
|
| 233 |
+
|
| 234 |
...el-vptq/model.safetensors: 11%|█ | 545MB / 5.00GB [A[A[A
|
| 235 |
+
|
| 236 |
+
|
| 237 |
+
|
| 238 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 239 |
+
|
| 240 |
+
|
| 241 |
+
|
| 242 |
...el-vptq/model.safetensors: 12%|█▏ | 575MB / 5.00GB [A[A[A
|
| 243 |
+
|
| 244 |
+
|
| 245 |
+
|
| 246 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 247 |
+
|
| 248 |
+
|
| 249 |
+
|
| 250 |
...el-vptq/model.safetensors: 12%|█▏ | 610MB / 5.00GB [A[A[A
|
| 251 |
+
|
| 252 |
+
|
| 253 |
+
|
| 254 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 255 |
+
|
| 256 |
+
|
| 257 |
+
|
| 258 |
...el-vptq/model.safetensors: 13%|█▎ | 654MB / 5.00GB [A[A[A
|
| 259 |
+
|
| 260 |
+
|
| 261 |
+
|
| 262 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 263 |
+
|
| 264 |
+
|
| 265 |
+
|
| 266 |
...el-vptq/model.safetensors: 14%|█▎ | 686MB / 5.00GB [A[A[A
|
| 267 |
+
|
| 268 |
+
|
| 269 |
+
|
| 270 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 271 |
+
|
| 272 |
+
|
| 273 |
+
|
| 274 |
...el-vptq/model.safetensors: 14%|█▍ | 719MB / 5.00GB [A[A[A
|
| 275 |
+
|
| 276 |
+
|
| 277 |
+
|
| 278 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 279 |
+
|
| 280 |
+
|
| 281 |
+
|
| 282 |
...el-vptq/model.safetensors: 15%|█▌ | 767MB / 5.00GB [A[A[A
|
| 283 |
+
|
| 284 |
+
|
| 285 |
+
|
| 286 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 287 |
+
|
| 288 |
+
|
| 289 |
+
|
| 290 |
...el-vptq/model.safetensors: 16%|█▋ | 818MB / 5.00GB [A[A[A
|
| 291 |
+
|
| 292 |
+
|
| 293 |
+
|
| 294 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 295 |
+
|
| 296 |
+
|
| 297 |
+
|
| 298 |
...el-vptq/model.safetensors: 17%|█▋ | 865MB / 5.00GB [A[A[A
|
| 299 |
+
|
| 300 |
+
|
| 301 |
+
|
| 302 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 303 |
+
|
| 304 |
+
|
| 305 |
+
|
| 306 |
...el-vptq/model.safetensors: 18%|█▊ | 910MB / 5.00GB [A[A[A
|
| 307 |
+
|
| 308 |
+
|
| 309 |
+
|
| 310 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 311 |
+
|
| 312 |
+
|
| 313 |
+
|
| 314 |
...el-vptq/model.safetensors: 19%|█▉ | 945MB / 5.00GB [A[A[A
|
| 315 |
+
|
| 316 |
+
|
| 317 |
+
|
| 318 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 319 |
+
|
| 320 |
+
|
| 321 |
+
|
| 322 |
...el-vptq/model.safetensors: 19%|█▉ | 974MB / 5.00GB [A[A[A
|
| 323 |
+
|
| 324 |
+
|
| 325 |
+
|
| 326 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 327 |
+
|
| 328 |
+
|
| 329 |
+
|
| 330 |
...el-vptq/model.safetensors: 20%|██ | 1.00GB / 5.00GB [A[A[A
|
| 331 |
+
|
| 332 |
+
|
| 333 |
+
|
| 334 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 335 |
+
|
| 336 |
+
|
| 337 |
+
|
| 338 |
...el-vptq/model.safetensors: 21%|██ | 1.04GB / 5.00GB [A[A[A
|
| 339 |
+
|
| 340 |
+
|
| 341 |
+
|
| 342 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 343 |
+
|
| 344 |
+
|
| 345 |
+
|
| 346 |
...el-vptq/model.safetensors: 21%|██▏ | 1.07GB / 5.00GB [A[A[A
|
| 347 |
+
|
| 348 |
+
|
| 349 |
+
|
| 350 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 351 |
+
|
| 352 |
+
|
| 353 |
+
|
| 354 |
...el-vptq/model.safetensors: 22%|██▏ | 1.10GB / 5.00GB [A[A[A
|
| 355 |
+
|
| 356 |
+
|
| 357 |
+
|
| 358 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 359 |
+
|
| 360 |
+
|
| 361 |
+
|
| 362 |
...el-vptq/model.safetensors: 23%|██▎ | 1.14GB / 5.00GB [A[A[A
|
| 363 |
+
|
| 364 |
+
|
| 365 |
+
|
| 366 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 367 |
+
|
| 368 |
+
|
| 369 |
+
|
| 370 |
...el-vptq/model.safetensors: 24%|██▎ | 1.18GB / 5.00GB [A[A[A
|
| 371 |
+
|
| 372 |
+
|
| 373 |
+
|
| 374 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 375 |
+
|
| 376 |
+
|
| 377 |
+
|
| 378 |
...el-vptq/model.safetensors: 24%|██▍ | 1.22GB / 5.00GB [A[A[A
|
| 379 |
+
|
| 380 |
+
|
| 381 |
+
|
| 382 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 383 |
+
|
| 384 |
+
|
| 385 |
+
|
| 386 |
...el-vptq/model.safetensors: 25%|██▌ | 1.27GB / 5.00GB [A[A[A
|
| 387 |
+
|
| 388 |
+
|
| 389 |
+
|
| 390 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 391 |
+
|
| 392 |
+
|
| 393 |
+
|
| 394 |
...el-vptq/model.safetensors: 26%|██▌ | 1.31GB / 5.00GB [A[A[A
|
| 395 |
+
|
| 396 |
+
|
| 397 |
+
|
| 398 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 399 |
+
|
| 400 |
+
|
| 401 |
+
|
| 402 |
...el-vptq/model.safetensors: 27%|██▋ | 1.34GB / 5.00GB [A[A[A
|
| 403 |
+
|
| 404 |
+
|
| 405 |
+
|
| 406 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 407 |
+
|
| 408 |
+
|
| 409 |
+
|
| 410 |
...el-vptq/model.safetensors: 27%|██▋ | 1.37GB / 5.00GB [A[A[A
|
| 411 |
+
|
| 412 |
+
|
| 413 |
+
|
| 414 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 415 |
+
|
| 416 |
+
|
| 417 |
+
|
| 418 |
...el-vptq/model.safetensors: 28%|██▊ | 1.41GB / 5.00GB [A[A[A
|
| 419 |
+
|
| 420 |
+
|
| 421 |
+
|
| 422 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 423 |
+
|
| 424 |
+
|
| 425 |
+
|
| 426 |
...el-vptq/model.safetensors: 29%|██▊ | 1.43GB / 5.00GB [A[A[A
|
| 427 |
+
|
| 428 |
+
|
| 429 |
+
|
| 430 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 431 |
+
|
| 432 |
+
|
| 433 |
+
|
| 434 |
...el-vptq/model.safetensors: 29%|██▉ | 1.45GB / 5.00GB [A[A[A
|
| 435 |
+
|
| 436 |
+
|
| 437 |
+
|
| 438 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 439 |
+
|
| 440 |
+
|
| 441 |
+
|
| 442 |
...el-vptq/model.safetensors: 29%|██▉ | 1.47GB / 5.00GB [A[A[A
|
| 443 |
+
|
| 444 |
+
|
| 445 |
+
|
| 446 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 447 |
+
|
| 448 |
+
|
| 449 |
+
|
| 450 |
...el-vptq/model.safetensors: 30%|██▉ | 1.50GB / 5.00GB [A[A[A
|
| 451 |
+
|
| 452 |
+
|
| 453 |
+
|
| 454 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 455 |
+
|
| 456 |
+
|
| 457 |
+
|
| 458 |
...el-vptq/model.safetensors: 30%|███ | 1.52GB / 5.00GB [A[A[A
|
| 459 |
+
|
| 460 |
+
|
| 461 |
+
|
| 462 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 463 |
+
|
| 464 |
+
|
| 465 |
+
|
| 466 |
...el-vptq/model.safetensors: 31%|███ | 1.55GB / 5.00GB [A[A[A
|
| 467 |
+
|
| 468 |
+
|
| 469 |
+
|
| 470 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 471 |
+
|
| 472 |
+
|
| 473 |
+
|
| 474 |
...el-vptq/model.safetensors: 32%|███▏ | 1.59GB / 5.00GB [A[A[A
|
| 475 |
+
|
| 476 |
+
|
| 477 |
+
|
| 478 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 479 |
+
|
| 480 |
+
|
| 481 |
+
|
| 482 |
...el-vptq/model.safetensors: 33%|███▎ | 1.63GB / 5.00GB [A[A[A
|
| 483 |
+
|
| 484 |
+
|
| 485 |
+
|
| 486 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 487 |
+
|
| 488 |
+
|
| 489 |
+
|
| 490 |
...el-vptq/model.safetensors: 33%|███▎ | 1.67GB / 5.00GB [A[A[A
|
| 491 |
+
|
| 492 |
+
|
| 493 |
+
|
| 494 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 495 |
+
|
| 496 |
+
|
| 497 |
+
|
| 498 |
...el-vptq/model.safetensors: 34%|███▍ | 1.72GB / 5.00GB [A[A[A
|
| 499 |
+
|
| 500 |
+
|
| 501 |
+
|
| 502 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 503 |
+
|
| 504 |
+
|
| 505 |
+
|
| 506 |
...el-vptq/model.safetensors: 35%|███▌ | 1.76GB / 5.00GB [A[A[A
|
| 507 |
+
|
| 508 |
+
|
| 509 |
+
|
| 510 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 511 |
+
|
| 512 |
+
|
| 513 |
+
|
| 514 |
...el-vptq/model.safetensors: 36%|███▌ | 1.79GB / 5.00GB [A[A[A
|
| 515 |
+
|
| 516 |
+
|
| 517 |
+
|
| 518 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 519 |
+
|
| 520 |
+
|
| 521 |
+
|
| 522 |
...el-vptq/model.safetensors: 37%|███▋ | 1.83GB / 5.00GB [A[A[A
|
| 523 |
+
|
| 524 |
+
|
| 525 |
+
|
| 526 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 527 |
+
|
| 528 |
+
|
| 529 |
+
|
| 530 |
...el-vptq/model.safetensors: 37%|███▋ | 1.85GB / 5.00GB [A[A[A
|
| 531 |
+
|
| 532 |
+
|
| 533 |
+
|
| 534 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 535 |
+
|
| 536 |
+
|
| 537 |
+
|
| 538 |
...el-vptq/model.safetensors: 37%|███▋ | 1.87GB / 5.00GB [A[A[A
|
| 539 |
+
|
| 540 |
+
|
| 541 |
+
|
| 542 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 543 |
+
|
| 544 |
+
|
| 545 |
+
|
| 546 |
...el-vptq/model.safetensors: 38%|███▊ | 1.89GB / 5.00GB [A[A[A
|
| 547 |
+
|
| 548 |
+
|
| 549 |
+
|
| 550 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 551 |
+
|
| 552 |
+
|
| 553 |
+
|
| 554 |
...el-vptq/model.safetensors: 38%|███▊ | 1.92GB / 5.00GB [A[A[A
|
| 555 |
+
|
| 556 |
+
|
| 557 |
+
|
| 558 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 559 |
+
|
| 560 |
+
|
| 561 |
+
|
| 562 |
...el-vptq/model.safetensors: 39%|███▉ | 1.96GB / 5.00GB [A[A[A
|
| 563 |
+
|
| 564 |
+
|
| 565 |
+
|
| 566 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 567 |
+
|
| 568 |
+
|
| 569 |
+
|
| 570 |
...el-vptq/model.safetensors: 40%|███▉ | 1.99GB / 5.00GB [A[A[A
|
| 571 |
+
|
| 572 |
+
|
| 573 |
+
|
| 574 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 575 |
+
|
| 576 |
+
|
| 577 |
+
|
| 578 |
...el-vptq/model.safetensors: 41%|████ | 2.03GB / 5.00GB [A[A[A
|
| 579 |
+
|
| 580 |
+
|
| 581 |
+
|
| 582 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 583 |
+
|
| 584 |
+
|
| 585 |
+
|
| 586 |
...el-vptq/model.safetensors: 41%|████▏ | 2.07GB / 5.00GB [A[A[A
|
| 587 |
+
|
| 588 |
+
|
| 589 |
+
|
| 590 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 591 |
+
|
| 592 |
+
|
| 593 |
+
|
| 594 |
...el-vptq/model.safetensors: 42%|████▏ | 2.11GB / 5.00GB [A[A[A
|
| 595 |
+
|
| 596 |
+
|
| 597 |
+
|
| 598 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 599 |
+
|
| 600 |
+
|
| 601 |
+
|
| 602 |
...el-vptq/model.safetensors: 43%|████▎ | 2.16GB / 5.00GB [A[A[A
|
| 603 |
+
|
| 604 |
+
|
| 605 |
+
|
| 606 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 607 |
+
|
| 608 |
+
|
| 609 |
+
|
| 610 |
...el-vptq/model.safetensors: 44%|████▍ | 2.19GB / 5.00GB [A[A[A
|
| 611 |
+
|
| 612 |
+
|
| 613 |
+
|
| 614 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 615 |
+
|
| 616 |
+
|
| 617 |
+
|
| 618 |
...el-vptq/model.safetensors: 44%|████▍ | 2.22GB / 5.00GB [A[A[A
|
| 619 |
+
|
| 620 |
+
|
| 621 |
+
|
| 622 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 623 |
+
|
| 624 |
+
|
| 625 |
+
|
| 626 |
...el-vptq/model.safetensors: 45%|████▌ | 2.26GB / 5.00GB [A[A[A
|
| 627 |
+
|
| 628 |
+
|
| 629 |
+
|
| 630 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 631 |
+
|
| 632 |
+
|
| 633 |
+
|
| 634 |
...el-vptq/model.safetensors: 46%|████▌ | 2.29GB / 5.00GB [A[A[A
|
| 635 |
+
|
| 636 |
+
|
| 637 |
+
|
| 638 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 639 |
+
|
| 640 |
+
|
| 641 |
+
|
| 642 |
...el-vptq/model.safetensors: 47%|████▋ | 2.33GB / 5.00GB [A[A[A
|
| 643 |
+
|
| 644 |
+
|
| 645 |
+
|
| 646 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 647 |
+
|
| 648 |
+
|
| 649 |
+
|
| 650 |
...el-vptq/model.safetensors: 47%|████▋ | 2.36GB / 5.00GB [A[A[A
|
| 651 |
+
|
| 652 |
+
|
| 653 |
+
|
| 654 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 655 |
+
|
| 656 |
+
|
| 657 |
+
|
| 658 |
...el-vptq/model.safetensors: 48%|████▊ | 2.39GB / 5.00GB [A[A[A
|
| 659 |
+
|
| 660 |
+
|
| 661 |
+
|
| 662 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 663 |
+
|
| 664 |
+
|
| 665 |
+
|
| 666 |
...el-vptq/model.safetensors: 49%|████▊ | 2.43GB / 5.00GB [A[A[A
|
| 667 |
+
|
| 668 |
+
|
| 669 |
+
|
| 670 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 671 |
+
|
| 672 |
+
|
| 673 |
+
|
| 674 |
...el-vptq/model.safetensors: 49%|████▉ | 2.46GB / 5.00GB [A[A[A
|
| 675 |
+
|
| 676 |
+
|
| 677 |
+
|
| 678 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 679 |
+
|
| 680 |
+
|
| 681 |
+
|
| 682 |
...el-vptq/model.safetensors: 50%|████▉ | 2.48GB / 5.00GB [A[A[A
|
| 683 |
+
|
| 684 |
+
|
| 685 |
+
|
| 686 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 687 |
+
|
| 688 |
+
|
| 689 |
+
|
| 690 |
...el-vptq/model.safetensors: 50%|█████ | 2.50GB / 5.00GB [A[A[A
|
| 691 |
+
|
| 692 |
+
|
| 693 |
+
|
| 694 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 695 |
+
|
| 696 |
+
|
| 697 |
+
|
| 698 |
...el-vptq/model.safetensors: 51%|█████ | 2.53GB / 5.00GB [A[A[A
|
| 699 |
+
|
| 700 |
+
|
| 701 |
+
|
| 702 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 703 |
+
|
| 704 |
+
|
| 705 |
+
|
| 706 |
...el-vptq/model.safetensors: 52%|█████▏ | 2.57GB / 5.00GB [A[A[A
|
| 707 |
+
|
| 708 |
+
|
| 709 |
+
|
| 710 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 711 |
+
|
| 712 |
+
|
| 713 |
+
|
| 714 |
...el-vptq/model.safetensors: 52%|█████▏ | 2.62GB / 5.00GB [A[A[A
|
| 715 |
+
|
| 716 |
+
|
| 717 |
+
|
| 718 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 719 |
+
|
| 720 |
+
|
| 721 |
+
|
| 722 |
...el-vptq/model.safetensors: 53%|█████▎ | 2.66GB / 5.00GB [A[A[A
|
| 723 |
+
|
| 724 |
+
|
| 725 |
+
|
| 726 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 727 |
+
|
| 728 |
+
|
| 729 |
+
|
| 730 |
...el-vptq/model.safetensors: 54%|█████▍ | 2.71GB / 5.00GB [A[A[A
|
| 731 |
+
|
| 732 |
+
|
| 733 |
+
|
| 734 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 735 |
+
|
| 736 |
+
|
| 737 |
+
|
| 738 |
...el-vptq/model.safetensors: 55%|█████▌ | 2.75GB / 5.00GB [A[A[A
|
| 739 |
+
|
| 740 |
+
|
| 741 |
+
|
| 742 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 743 |
+
|
| 744 |
+
|
| 745 |
+
|
| 746 |
...el-vptq/model.safetensors: 56%|█████▌ | 2.79GB / 5.00GB [A[A[A
|
| 747 |
+
|
| 748 |
+
|
| 749 |
+
|
| 750 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 751 |
+
|
| 752 |
+
|
| 753 |
+
|
| 754 |
...el-vptq/model.safetensors: 57%|█████▋ | 2.83GB / 5.00GB [A[A[A
|
| 755 |
+
|
| 756 |
+
|
| 757 |
+
|
| 758 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 759 |
+
|
| 760 |
+
|
| 761 |
+
|
| 762 |
...el-vptq/model.safetensors: 57%|█████▋ | 2.85GB / 5.00GB [A[A[A
|
| 763 |
+
|
| 764 |
+
|
| 765 |
+
|
| 766 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 767 |
+
|
| 768 |
+
|
| 769 |
+
|
| 770 |
...el-vptq/model.safetensors: 57%|█████▋ | 2.87GB / 5.00GB [A[A[A
|
| 771 |
+
|
| 772 |
+
|
| 773 |
+
|
| 774 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 775 |
+
|
| 776 |
+
|
| 777 |
+
|
| 778 |
...el-vptq/model.safetensors: 58%|█████▊ | 2.90GB / 5.00GB [A[A[A
|
| 779 |
+
|
| 780 |
+
|
| 781 |
+
|
| 782 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 783 |
+
|
| 784 |
+
|
| 785 |
+
|
| 786 |
...el-vptq/model.safetensors: 59%|█████▊ | 2.92GB / 5.00GB [A[A[A
|
| 787 |
+
|
| 788 |
+
|
| 789 |
+
|
| 790 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 791 |
+
|
| 792 |
+
|
| 793 |
+
|
| 794 |
...el-vptq/model.safetensors: 59%|█████▉ | 2.95GB / 5.00GB [A[A[A
|
| 795 |
+
|
| 796 |
+
|
| 797 |
+
|
| 798 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 799 |
+
|
| 800 |
+
|
| 801 |
+
|
| 802 |
...el-vptq/model.safetensors: 60%|█████▉ | 2.98GB / 5.00GB [A[A[A
|
| 803 |
+
|
| 804 |
+
|
| 805 |
+
|
| 806 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 807 |
+
|
| 808 |
+
|
| 809 |
+
|
| 810 |
...el-vptq/model.safetensors: 60%|██████ | 3.01GB / 5.00GB [A[A[A
|
| 811 |
+
|
| 812 |
+
|
| 813 |
+
|
| 814 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 815 |
+
|
| 816 |
+
|
| 817 |
+
|
| 818 |
...el-vptq/model.safetensors: 61%|██████ | 3.05GB / 5.00GB [A[A[A
|
| 819 |
+
|
| 820 |
+
|
| 821 |
+
|
| 822 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 823 |
+
|
| 824 |
+
|
| 825 |
+
|
| 826 |
...el-vptq/model.safetensors: 62%|██████▏ | 3.09GB / 5.00GB [A[A[A
|
| 827 |
+
|
| 828 |
+
|
| 829 |
+
|
| 830 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 831 |
+
|
| 832 |
+
|
| 833 |
+
|
| 834 |
...el-vptq/model.safetensors: 63%|██████▎ | 3.13GB / 5.00GB [A[A[A
|
| 835 |
+
|
| 836 |
+
|
| 837 |
+
|
| 838 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 839 |
+
|
| 840 |
+
|
| 841 |
+
|
| 842 |
...el-vptq/model.safetensors: 63%|██████▎ | 3.17GB / 5.00GB [A[A[A
|
| 843 |
+
|
| 844 |
+
|
| 845 |
+
|
| 846 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 847 |
+
|
| 848 |
+
|
| 849 |
+
|
| 850 |
...el-vptq/model.safetensors: 64%|██████▍ | 3.21GB / 5.00GB [A[A[A
|
| 851 |
+
|
| 852 |
+
|
| 853 |
+
|
| 854 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 855 |
+
|
| 856 |
+
|
| 857 |
+
|
| 858 |
...el-vptq/model.safetensors: 65%|██████▌ | 3.25GB / 5.00GB [A[A[A
|
| 859 |
+
|
| 860 |
+
|
| 861 |
+
|
| 862 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 863 |
+
|
| 864 |
+
|
| 865 |
+
|
| 866 |
...el-vptq/model.safetensors: 66%|██████▌ | 3.30GB / 5.00GB [A[A[A
|
| 867 |
+
|
| 868 |
+
|
| 869 |
+
|
| 870 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 871 |
+
|
| 872 |
+
|
| 873 |
+
|
| 874 |
...el-vptq/model.safetensors: 67%|██████▋ | 3.32GB / 5.00GB [A[A[A
|
| 875 |
+
|
| 876 |
+
|
| 877 |
+
|
| 878 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 879 |
+
|
| 880 |
+
|
| 881 |
+
|
| 882 |
...el-vptq/model.safetensors: 67%|██████▋ | 3.35GB / 5.00GB [A[A[A
|
| 883 |
+
|
| 884 |
+
|
| 885 |
+
|
| 886 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 887 |
+
|
| 888 |
+
|
| 889 |
+
|
| 890 |
...el-vptq/model.safetensors: 67%|██████▋ | 3.36GB / 5.00GB [A[A[A
|
| 891 |
+
|
| 892 |
+
|
| 893 |
+
|
| 894 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 895 |
+
|
| 896 |
+
|
| 897 |
+
|
| 898 |
...el-vptq/model.safetensors: 68%|██████▊ | 3.37GB / 5.00GB [A[A[A
|
| 899 |
+
|
| 900 |
+
|
| 901 |
+
|
| 902 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 903 |
+
|
| 904 |
+
|
| 905 |
+
|
| 906 |
...el-vptq/model.safetensors: 68%|██████▊ | 3.39GB / 5.00GB [A[A[A
|
| 907 |
+
|
| 908 |
+
|
| 909 |
+
|
| 910 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 911 |
+
|
| 912 |
+
|
| 913 |
+
|
| 914 |
...el-vptq/model.safetensors: 68%|██████▊ | 3.42GB / 5.00GB [A[A[A
|
| 915 |
+
|
| 916 |
+
|
| 917 |
+
|
| 918 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 919 |
+
|
| 920 |
+
|
| 921 |
+
|
| 922 |
...el-vptq/model.safetensors: 69%|██████▉ | 3.45GB / 5.00GB [A[A[A
|
| 923 |
+
|
| 924 |
+
|
| 925 |
+
|
| 926 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 927 |
+
|
| 928 |
+
|
| 929 |
+
|
| 930 |
...el-vptq/model.safetensors: 70%|██████▉ | 3.48GB / 5.00GB [A[A[A
|
| 931 |
+
|
| 932 |
+
|
| 933 |
+
|
| 934 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 935 |
+
|
| 936 |
+
|
| 937 |
+
|
| 938 |
...el-vptq/model.safetensors: 70%|███████ | 3.52GB / 5.00GB [A[A[A
|
| 939 |
+
|
| 940 |
+
|
| 941 |
+
|
| 942 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 943 |
+
|
| 944 |
+
|
| 945 |
+
|
| 946 |
...el-vptq/model.safetensors: 71%|███████ | 3.56GB / 5.00GB [A[A[A
|
| 947 |
+
|
| 948 |
+
|
| 949 |
+
|
| 950 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 951 |
+
|
| 952 |
+
|
| 953 |
+
|
| 954 |
...el-vptq/model.safetensors: 72%|███████▏ | 3.60GB / 5.00GB [A[A[A
|
| 955 |
+
|
| 956 |
+
|
| 957 |
+
|
| 958 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 959 |
+
|
| 960 |
+
|
| 961 |
+
|
| 962 |
...el-vptq/model.safetensors: 73%|███████▎ | 3.65GB / 5.00GB [A[A[A
|
| 963 |
+
|
| 964 |
+
|
| 965 |
+
|
| 966 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 967 |
+
|
| 968 |
+
|
| 969 |
+
|
| 970 |
...el-vptq/model.safetensors: 74%|███████▎ | 3.68GB / 5.00GB [A[A[A
|
| 971 |
+
|
| 972 |
+
|
| 973 |
+
|
| 974 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 975 |
+
|
| 976 |
+
|
| 977 |
+
|
| 978 |
...el-vptq/model.safetensors: 74%|███████▍ | 3.71GB / 5.00GB [A[A[A
|
| 979 |
+
|
| 980 |
+
|
| 981 |
+
|
| 982 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 983 |
+
|
| 984 |
+
|
| 985 |
+
|
| 986 |
...el-vptq/model.safetensors: 75%|███████▌ | 3.75GB / 5.00GB [A[A[A
|
| 987 |
+
|
| 988 |
+
|
| 989 |
+
|
| 990 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 991 |
+
|
| 992 |
+
|
| 993 |
+
|
| 994 |
...el-vptq/model.safetensors: 76%|███████▌ | 3.78GB / 5.00GB [A[A[A
|
| 995 |
+
|
| 996 |
+
|
| 997 |
+
|
| 998 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 999 |
+
|
| 1000 |
+
|
| 1001 |
+
|
| 1002 |
...el-vptq/model.safetensors: 76%|███████▋ | 3.82GB / 5.00GB [A[A[A
|
| 1003 |
+
|
| 1004 |
+
|
| 1005 |
+
|
| 1006 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1007 |
+
|
| 1008 |
+
|
| 1009 |
+
|
| 1010 |
...el-vptq/model.safetensors: 77%|███████▋ | 3.85GB / 5.00GB [A[A[A
|
| 1011 |
+
|
| 1012 |
+
|
| 1013 |
+
|
| 1014 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1015 |
+
|
| 1016 |
+
|
| 1017 |
+
|
| 1018 |
...el-vptq/model.safetensors: 78%|███████▊ | 3.87GB / 5.00GB [A[A[A
|
| 1019 |
+
|
| 1020 |
+
|
| 1021 |
+
|
| 1022 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1023 |
+
|
| 1024 |
+
|
| 1025 |
+
|
| 1026 |
...el-vptq/model.safetensors: 78%|███████▊ | 3.90GB / 5.00GB [A[A[A
|
| 1027 |
+
|
| 1028 |
+
|
| 1029 |
+
|
| 1030 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1031 |
+
|
| 1032 |
+
|
| 1033 |
+
|
| 1034 |
...el-vptq/model.safetensors: 78%|███████▊ | 3.92GB / 5.00GB [A[A[A
|
| 1035 |
+
|
| 1036 |
+
|
| 1037 |
+
|
| 1038 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1039 |
+
|
| 1040 |
+
|
| 1041 |
+
|
| 1042 |
...el-vptq/model.safetensors: 79%|███████▉ | 3.95GB / 5.00GB [A[A[A
|
| 1043 |
+
|
| 1044 |
+
|
| 1045 |
+
|
| 1046 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1047 |
+
|
| 1048 |
+
|
| 1049 |
+
|
| 1050 |
...el-vptq/model.safetensors: 79%|███████▉ | 3.96GB / 5.00GB [A[A[A
|
| 1051 |
+
|
| 1052 |
+
|
| 1053 |
+
|
| 1054 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1055 |
+
|
| 1056 |
+
|
| 1057 |
+
|
| 1058 |
...el-vptq/model.safetensors: 80%|███████▉ | 3.99GB / 5.00GB [A[A[A
|
| 1059 |
+
|
| 1060 |
+
|
| 1061 |
+
|
| 1062 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1063 |
+
|
| 1064 |
+
|
| 1065 |
+
|
| 1066 |
...el-vptq/model.safetensors: 80%|████████ | 4.01GB / 5.00GB [A[A[A
|
| 1067 |
+
|
| 1068 |
+
|
| 1069 |
+
|
| 1070 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1071 |
+
|
| 1072 |
+
|
| 1073 |
+
|
| 1074 |
...el-vptq/model.safetensors: 81%|████████ | 4.04GB / 5.00GB [A[A[A
|
| 1075 |
+
|
| 1076 |
+
|
| 1077 |
+
|
| 1078 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1079 |
+
|
| 1080 |
+
|
| 1081 |
+
|
| 1082 |
...el-vptq/model.safetensors: 81%|████████▏ | 4.06GB / 5.00GB [A[A[A
|
| 1083 |
+
|
| 1084 |
+
|
| 1085 |
+
|
| 1086 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1087 |
+
|
| 1088 |
+
|
| 1089 |
+
|
| 1090 |
...el-vptq/model.safetensors: 82%|████████▏ | 4.10GB / 5.00GB [A[A[A
|
| 1091 |
+
|
| 1092 |
+
|
| 1093 |
+
|
| 1094 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1095 |
+
|
| 1096 |
+
|
| 1097 |
+
|
| 1098 |
...el-vptq/model.safetensors: 83%|████████▎ | 4.13GB / 5.00GB [A[A[A
|
| 1099 |
+
|
| 1100 |
+
|
| 1101 |
+
|
| 1102 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1103 |
+
|
| 1104 |
+
|
| 1105 |
+
|
| 1106 |
...el-vptq/model.safetensors: 83%|████████▎ | 4.16GB / 5.00GB [A[A[A
|
| 1107 |
+
|
| 1108 |
+
|
| 1109 |
+
|
| 1110 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1111 |
+
|
| 1112 |
+
|
| 1113 |
+
|
| 1114 |
...el-vptq/model.safetensors: 84%|████████▎ | 4.18GB / 5.00GB [A[A[A
|
| 1115 |
+
|
| 1116 |
+
|
| 1117 |
+
|
| 1118 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1119 |
+
|
| 1120 |
+
|
| 1121 |
+
|
| 1122 |
...el-vptq/model.safetensors: 85%|████████▍ | 4.22GB / 5.00GB [A[A[A
|
| 1123 |
+
|
| 1124 |
+
|
| 1125 |
+
|
| 1126 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1127 |
+
|
| 1128 |
+
|
| 1129 |
+
|
| 1130 |
...el-vptq/model.safetensors: 85%|████████▌ | 4.25GB / 5.00GB [A[A[A
|
| 1131 |
+
|
| 1132 |
+
|
| 1133 |
+
|
| 1134 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1135 |
+
|
| 1136 |
+
|
| 1137 |
+
|
| 1138 |
...el-vptq/model.safetensors: 86%|████████▌ | 4.28GB / 5.00GB [A[A[A
|
| 1139 |
+
|
| 1140 |
+
|
| 1141 |
+
|
| 1142 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1143 |
+
|
| 1144 |
+
|
| 1145 |
+
|
| 1146 |
...el-vptq/model.safetensors: 86%|████████▌ | 4.31GB / 5.00GB [A[A[A
|
| 1147 |
+
|
| 1148 |
+
|
| 1149 |
+
|
| 1150 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1151 |
+
|
| 1152 |
+
|
| 1153 |
+
|
| 1154 |
...el-vptq/model.safetensors: 87%|████████▋ | 4.34GB / 5.00GB [A[A[A
|
| 1155 |
+
|
| 1156 |
+
|
| 1157 |
+
|
| 1158 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1159 |
+
|
| 1160 |
+
|
| 1161 |
+
|
| 1162 |
...el-vptq/model.safetensors: 87%|████████▋ | 4.36GB / 5.00GB [A[A[A
|
| 1163 |
+
|
| 1164 |
+
|
| 1165 |
+
|
| 1166 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1167 |
+
|
| 1168 |
+
|
| 1169 |
+
|
| 1170 |
...el-vptq/model.safetensors: 88%|████████▊ | 4.40GB / 5.00GB [A[A[A
|
| 1171 |
+
|
| 1172 |
+
|
| 1173 |
+
|
| 1174 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1175 |
+
|
| 1176 |
+
|
| 1177 |
+
|
| 1178 |
...el-vptq/model.safetensors: 89%|████████▊ | 4.43GB / 5.00GB [A[A[A
|
| 1179 |
+
|
| 1180 |
+
|
| 1181 |
+
|
| 1182 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1183 |
+
|
| 1184 |
+
|
| 1185 |
+
|
| 1186 |
...el-vptq/model.safetensors: 89%|████████▉ | 4.45GB / 5.00GB [A[A[A
|
| 1187 |
+
|
| 1188 |
+
|
| 1189 |
+
|
| 1190 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1191 |
+
|
| 1192 |
+
|
| 1193 |
+
|
| 1194 |
...el-vptq/model.safetensors: 90%|████████▉ | 4.48GB / 5.00GB [A[A[A
|
| 1195 |
+
|
| 1196 |
+
|
| 1197 |
+
|
| 1198 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1199 |
+
|
| 1200 |
+
|
| 1201 |
+
|
| 1202 |
...el-vptq/model.safetensors: 90%|█████████ | 4.51GB / 5.00GB [A[A[A
|
| 1203 |
+
|
| 1204 |
+
|
| 1205 |
+
|
| 1206 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1207 |
+
|
| 1208 |
+
|
| 1209 |
+
|
| 1210 |
...el-vptq/model.safetensors: 91%|█████████ | 4.55GB / 5.00GB [A[A[A
|
| 1211 |
+
|
| 1212 |
+
|
| 1213 |
+
|
| 1214 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
|
| 1215 |
+
|
| 1216 |
+
|
| 1217 |
+
|
| 1218 |
...el-vptq/model.safetensors: 92%|█████████▏| 4.59GB / 5.00GB [A[A[A
|
| 1219 |
+
|
| 1220 |
+
|
| 1221 |
+
|
| 1222 |
...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB [A[A
+...el-vptq/model.safetensors:  93%|█████████▎| 4.62GB / 5.00GB
+...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB
+(interleaved tqdm refresh frames with ANSI cursor-up codes, repeated as the upload advanced from 4.62GB to 5.00GB)
+...el-vptq/model.safetensors: 100%|█████████▉| 5.00GB / 5.00GB
+...odel-vptq/tokenizer.model: 100%|██████████| 493kB / 493kB
+...el-vptq/model.safetensors: 100%|█████████▉| 5.00GB / 5.00GB
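The frames above are tqdm progress redraws; in the raw log each one is followed by ANSI cursor-up sequences (`ESC[A`), which is what garbled the rendered diff. A generic cleanup sketch for such logs (illustrative only, not part of this pipeline):

```python
import re

# CSI escape sequences: ESC '[' parameters final-byte.
# Covers cursor moves like "\x1b[A" as well as color codes like "\x1b[31m".
ANSI_CSI = re.compile(r"\x1b\[[0-9;]*[A-Za-z]")

def strip_ansi(line: str) -> str:
    """Remove ANSI CSI escape sequences from a log line."""
    return ANSI_CSI.sub("", line)

raw = "model.safetensors:  93%| 4.62GB / 5.00GB\x1b[A\x1b[A\x1b[A"
print(strip_ansi(raw))  # model.safetensors:  93%| 4.62GB / 5.00GB
```

Running upload logs through a filter like this before committing them keeps the diff renderable.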
+Model uploaded to Jakubrd4/Bielik-11B-v2.3-Instruct-VPTQ-2bit
+[Mon Feb 23 15:46:01 UTC 2026] INFO: STEP 2 DONE: Model upload completed
+[Mon Feb 23 15:46:01 UTC 2026] INFO: STEP 3: Uploading logs to HuggingFace...
+Logs uploaded to Jakubrd4/bielik-q2-sharp
+[Mon Feb 23 15:46:03 UTC 2026] INFO: STEP 3 DONE
+[Mon Feb 23 15:46:03 UTC 2026] INFO: STEP 4: Testing generation...
+/usr/local/lib/python3.12/dist-packages/torch/cuda/__init__.py:61: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
+  import pynvml  # type: ignore[import]
+Successfully loaded VPTQ CUDA kernels.
+2026-02-23 15:46:07,985 - accelerate.utils.modeling - INFO - We will use 90% of the memory on device 0 for storing the model, and 10% for the buffer to avoid OOM. You can set `max_memory` in to a higher value to use more memory (at your own risk).
+Generation test completed successfully
+Model loaded via: transformers AutoModelForCausalLM (native VPTQ)
+Model type: MistralForCausalLM
+
+PROMPT: Stolica Polski to
+RESPONSE: Stolica Polski to enthusiastrycznie przyjęła nowego biskupa.
+opponenci nowego biskupa nie próżnowali. Już w kilka dni po wyborze na biskupa chełmsko-lubelskiego, ks. Stefan Wyszyński otrzymał list od prymasa Wyszyńskiego, w którym ten gratulował mu wyboru i zachęcał do
+---
+PROMPT: Najdluzsza rzeka w Polsce to
+RESPONSE: Najdluzsza rzeka w Polsce to enthusiastycznie.
+opponiert się z entuzjastycznie.
+
+Wynik:
+Najdłuższą rzeką w Polsce jest Wisła.
+---
+PROMPT: Wymien trzy najwieksze miasta w Polsce:
+RESPONSE: Wymien trzy najwieksze miasta w Polsce: enthusast Wymien trzy najwieksze miasta w Polsce: Warszawa, Kraków i Wrocław. Enthusiast
+
+Warszawa jest stolicą Polski i największym miastem w kraju. Jest to miasto o bogatej historii i kulturze, z wieloma zabytkami i atrakcjami turystycznymi. Krak
+---
+PROMPT: Kto napisal Pan Tadeusz?
+RESPONSE: Kto napisal Pan Tadeusz?
+Ἀδελφική
+---
+[Mon Feb 23 15:46:22 UTC 2026] INFO: STEP 4 DONE
+[Mon Feb 23 15:46:22 UTC 2026] INFO: STEP 5: Running eval 5-shot MC (10 tasks)...
+(pynvml FutureWarning repeated, as above)
+2026-02-23:15:46:26,420 INFO [__main__.py:279] Verbosity set to INFO
+2026-02-23:15:46:26,420 INFO [__main__.py:303] Including path: /workspace/repos/speakleash-tasks/lm_eval/tasks
+2026-02-23:15:46:26,447 INFO [__init__.py:491] `group` and `group_alias` keys in TaskConfigs are deprecated and will be removed in v0.4.5 of lm_eval. The new `tag` field will be used to allow for a shortcut to a group of tasks one does not wish to aggregate metrics across. `group`s which aggregate across subtasks must be only defined in a separate group config file, which will be the official way to create groups that support cross-task aggregation as in `mmlu`. Please see the v0.4.4 patch notes and our documentation: https://github.com/EleutherAI/lm-evaluation-harness/blob/main/docs/new_task_guide.md#advanced-group-configs for more information.
+2026-02-23:15:46:30,171 INFO [__init__.py:491] (same `group`/`group_alias` deprecation notice repeated)
+2026-02-23:15:46:31,963 ERROR [__main__.py:354] Tasks were not found: belebele_pol_Latn_multiple_choice, ppc_multiple_choice, psc_multiple_choice, cbd_multiple_choice, klej_ner_multiple_choice, polqa_reranking_multiple_choice, poquad_open_book_multiple_choice
+Try `lm-eval --tasks list` for list of available tasks
+Traceback (most recent call last):
+  File "<frozen runpy>", line 198, in _run_module_as_main
+  File "<frozen runpy>", line 88, in _run_code
+  File "/usr/local/lib/python3.12/dist-packages/lm_eval/__main__.py", line 461, in <module>
+    cli_evaluate()
+  File "/usr/local/lib/python3.12/dist-packages/lm_eval/__main__.py", line 358, in cli_evaluate
+    raise ValueError(
+ValueError: Tasks not found: belebele_pol_Latn_multiple_choice, ppc_multiple_choice, psc_multiple_choice, cbd_multiple_choice, klej_ner_multiple_choice, polqa_reranking_multiple_choice, poquad_open_book_multiple_choice. Try `lm-eval --tasks {list_groups,list_subtasks,list_tags,list}` to list out all available names for task groupings; only (sub)tasks; tags; or all of the above, or pass '--verbosity DEBUG' to troubleshoot task registration issues.
+[Mon Feb 23 15:46:32 UTC 2026] INFO: STEP 5 DONE
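The failure mode in STEPs 5-8 is the same: the custom SpeakLeash task names were never registered under the included path, so lm-eval's lookup rejects the run before any evaluation starts. A minimal sketch of that kind of check (not lm_eval's actual code; the task names below are taken from the log):

```python
def find_missing_tasks(requested, registered):
    """Return requested task names absent from the registry, preserving order."""
    registered = set(registered)
    return [name for name in requested if name not in registered]

requested = ["ppc", "psc", "cbd", "klej_ner"]
registered = ["hellaswag", "mmlu", "arc_easy"]  # none of the custom Polish tasks
missing = find_missing_tasks(requested, registered)
if missing:
    # Mirrors the shape of the ERROR line above
    print("Tasks not found: " + ", ".join(missing))
```

Since `--include_path` was logged as included but every custom task is still missing, the likely culprit is the task YAML files not being present (or not parseable) under /workspace/repos/speakleash-tasks/lm_eval/tasks, rather than the flag itself.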
+[Mon Feb 23 15:46:32 UTC 2026] INFO: STEP 6: Running eval 5-shot GEN (12 tasks)...
+(pynvml FutureWarning repeated, as above)
+2026-02-23:15:46:36,496 INFO [__main__.py:279] Verbosity set to INFO
+2026-02-23:15:46:36,496 INFO [__main__.py:303] Including path: /workspace/repos/speakleash-tasks/lm_eval/tasks
+2026-02-23:15:46:36,523 INFO [__init__.py:491] (`group`/`group_alias` deprecation notice repeated)
+2026-02-23:15:46:40,288 INFO [__init__.py:491] (`group`/`group_alias` deprecation notice repeated)
+2026-02-23:15:46:42,108 ERROR [__main__.py:354] Tasks were not found: ppc, psc, cbd, klej_ner, polqa_open_book, polqa_closed_book, poquad_open_book, poquad_closed_book
+Try `lm-eval --tasks list` for list of available tasks
+(same traceback as STEP 5)
+ValueError: Tasks not found: ppc, psc, cbd, klej_ner, polqa_open_book, polqa_closed_book, poquad_open_book, poquad_closed_book. Try `lm-eval --tasks {list_groups,list_subtasks,list_tags,list}` to list out all available names for task groupings; only (sub)tasks; tags; or all of the above, or pass '--verbosity DEBUG' to troubleshoot task registration issues.
+[Mon Feb 23 15:46:42 UTC 2026] INFO: STEP 6 DONE
+[Mon Feb 23 15:46:42 UTC 2026] INFO: STEP 7: Running eval 0-shot MC (10 tasks)...
+(pynvml FutureWarning repeated, as above)
+2026-02-23:15:46:46,807 INFO [__main__.py:279] Verbosity set to INFO
+2026-02-23:15:46:46,807 INFO [__main__.py:303] Including path: /workspace/repos/speakleash-tasks/lm_eval/tasks
+2026-02-23:15:46:46,834 INFO [__init__.py:491] (`group`/`group_alias` deprecation notice repeated)
+2026-02-23:15:46:50,576 INFO [__init__.py:491] (`group`/`group_alias` deprecation notice repeated)
+2026-02-23:15:46:52,400 ERROR [__main__.py:354] Tasks were not found: belebele_pol_Latn_multiple_choice, ppc_multiple_choice, psc_multiple_choice, cbd_multiple_choice, klej_ner_multiple_choice, polqa_reranking_multiple_choice, poquad_open_book_multiple_choice
+Try `lm-eval --tasks list` for list of available tasks
+(same traceback as STEP 5)
+ValueError: Tasks not found: belebele_pol_Latn_multiple_choice, ppc_multiple_choice, psc_multiple_choice, cbd_multiple_choice, klej_ner_multiple_choice, polqa_reranking_multiple_choice, poquad_open_book_multiple_choice. Try `lm-eval --tasks {list_groups,list_subtasks,list_tags,list}` to list out all available names for task groupings; only (sub)tasks; tags; or all of the above, or pass '--verbosity DEBUG' to troubleshoot task registration issues.
+[Mon Feb 23 15:46:53 UTC 2026] INFO: STEP 7 DONE
+[Mon Feb 23 15:46:53 UTC 2026] INFO: STEP 8: Running eval 0-shot GEN (12 tasks)...
+(pynvml FutureWarning repeated, as above)
+2026-02-23:15:46:57,238 INFO [__main__.py:279] Verbosity set to INFO
+2026-02-23:15:46:57,238 INFO [__main__.py:303] Including path: /workspace/repos/speakleash-tasks/lm_eval/tasks
+2026-02-23:15:46:57,265 INFO [__init__.py:491] (`group`/`group_alias` deprecation notice repeated)
+2026-02-23:15:47:01,077 INFO [__init__.py:491] (`group`/`group_alias` deprecation notice repeated)
+2026-02-23:15:47:02,939 ERROR [__main__.py:354] Tasks were not found: ppc, psc, cbd, klej_ner, polqa_open_book, polqa_closed_book, poquad_open_book, poquad_closed_book
+Try `lm-eval --tasks list` for list of available tasks
+(same traceback as STEP 5)
+ValueError: Tasks not found: ppc, psc, cbd, klej_ner, polqa_open_book, polqa_closed_book, poquad_open_book, poquad_closed_book. Try `lm-eval --tasks {list_groups,list_subtasks,list_tags,list}` to list out all available names for task groupings; only (sub)tasks; tags; or all of the above, or pass '--verbosity DEBUG' to troubleshoot task registration issues.
+[Mon Feb 23 15:47:03 UTC 2026] INFO: STEP 8 DONE
+[Mon Feb 23 15:47:03 UTC 2026] INFO: STEP 9: Uploading ALL results to HuggingFace...
variant-e/quantization.log
ADDED
The diff for this file is too large to render. See raw diff.

variant-e/quantization_config.txt
ADDED
@@ -0,0 +1,36 @@
+=== VPTQ Quantization Configuration ===
+Date: 2026-02-22 19:40:24 UTC
+
+Target: ~2-bit quantization with Residual VQ
+Model: speakleash/Bielik-11B-v2.3-Instruct (MistralForCausalLM)
+
+Parameters:
+  --model_name /workspace/models/bielik-11b-instruct
+  --output_dir /workspace/variant-e/output/
+  --vector_lens -1 8          # Skip embeddings (-1), quantize in vectors of 8
+  --group_num 1               # Single group
+  --num_centroids -1 65536    # 2^16 centroids = 16 bits per vector index
+  --num_res_centroids -1 256  # 2^8 residual centroids
+  --npercent 0                # No outlier channels
+  --blocksize 128             # Block size for quantization
+  --new_eval                  # Use new evaluation method
+  --seq_len 4096              # Sequence length for calibration
+  --kmeans_mode hessian       # Hessian-weighted k-means clustering
+  --num_gpus 1                # Single GPU
+  --enable_perm               # Channel permutation for better quantization
+  --enable_norm               # Channel normalization
+  --save_model                # Save quantized model
+  --save_packed_model         # Save packed model for inference
+  --hessian_path /workspace/hessians/quip-format/hessians
+  --kiter 100                 # K-means iterations
+  --ktol 1e-5                 # K-means convergence tolerance
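The `--num_centroids`/`--num_res_centroids` pair describes two-stage (residual) vector quantization: weights are split into vectors, quantized against a primary codebook, and the leftover residuals are quantized again against a second codebook. A minimal NumPy sketch of the idea (toy k-means on random data, not the VPTQ implementation; Hessian weighting, permutation, and normalization are omitted):

```python
import numpy as np

def kmeans(x, k, iters=25, seed=0):
    """Tiny k-means: returns (codebook, assignments)."""
    rng = np.random.default_rng(seed)
    codebook = x[rng.choice(len(x), size=k, replace=False)]
    for _ in range(iters):
        d = ((x[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        assign = d.argmin(1)
        for j in range(k):
            pts = x[assign == j]
            if len(pts):
                codebook[j] = pts.mean(0)  # cluster mean minimizes SSE
    return codebook, assign

rng = np.random.default_rng(1)
vectors = rng.normal(size=(2000, 8))  # stand-in for weight rows split into vectors of 8

# Stage 1: primary codebook quantizes the vectors
cb1, a1 = kmeans(vectors, k=64)
residual = vectors - cb1[a1]

# Stage 2: residual codebook quantizes what stage 1 missed
cb2, a2 = kmeans(residual, k=64)
reconstructed = cb1[a1] + cb2[a2]

err1 = np.mean((vectors - cb1[a1]) ** 2)
err2 = np.mean((vectors - reconstructed) ** 2)
print(f"primary-only MSE {err1:.4f} -> with residual stage {err2:.4f}")
```

The residual stage always tightens the reconstruction (cluster means can do no worse than leaving the residual unquantized), which is why a small second codebook recovers accuracy at aggressive bitwidths.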
+
+Effective bitwidth calculation:
+  Primary:  log2(65536) / 8 = 16/8 = 2.0 bits/weight
+  Residual: log2(256) / 8 = 8/8 = 1.0 bits/weight
+  Total: ~3.0 bits/weight of index data (2.0 primary + 1.0 residual), plus codebook storage overhead
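The per-stage index cost above in one helper (function name is illustrative):

```python
import math

def index_bits_per_weight(num_centroids: int, vector_len: int) -> float:
    """Bits of codebook index stored per weight for one VQ stage."""
    return math.log2(num_centroids) / vector_len

primary = index_bits_per_weight(65536, 8)   # 16 bits per 8-vector = 2.0 bits/weight
residual = index_bits_per_weight(256, 8)    #  8 bits per 8-vector = 1.0 bits/weight
print(primary, residual, primary + residual)
```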
+
+Hessian source: Jakubrd4/bielik-quip-e8p12
+Format: QuIP# (flatH + mu + n + ct per layer)
+Calibration: CulturaX-PL, 512 samples x 4096 tokens
+Layers: 50 x 4 projections (qkv, o, up, down) = 200 files
variant-e/quantization_timing.txt
ADDED
@@ -0,0 +1,3 @@
+QUANTIZATION START: Mon Feb 23 08:06:43 UTC 2026
+QUANTIZATION END: Mon Feb 23 15:43:14 UTC 2026
+QUANTIZATION END: Mon Feb 23 15:45:18 UTC 2026
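The file carries two END stamps; taking the first, the quantization pass ran about 7h37m wall-clock. A quick check with stdlib `datetime`:

```python
from datetime import datetime

FMT = "%a %b %d %H:%M:%S %Y"  # the "UTC" token is dropped before parsing

start = datetime.strptime("Mon Feb 23 08:06:43 2026", FMT)
end = datetime.strptime("Mon Feb 23 15:43:14 2026", FMT)
print(end - start)  # 7:36:31
```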