Delete logs/convert_none.log with huggingface_hub
logs/convert_none.log  +0 -60
DELETED
@@ -1,60 +0,0 @@
-[!] HF_TOKEN is not set. If model is public, you can pass --allow-no-token.
-[+] Using existing model dir: /home/ubuntu/translategemma-4b-it
-[+] ARCH: {'model_type': 'gemma3', 'architecture': 'Gemma3ForConditionalGeneration', 'vocab_size': 262208}
-[+] Strategy 1: litert-torch native
-[+] Gemma3 builders available: ['build_model_1b', 'build_model_270m']
-[+] Trying litert_torch.generative.examples.gemma3.gemma3.build_model_1b ...
-[!] failed: 'NoneType' object has no attribute 'eval'
-[+] Trying litert_torch.generative.examples.gemma3.gemma3.build_model_270m ...
-[!] failed: 'NoneType' object has no attribute 'eval'
-[!] Strategy 1 did not find a compatible builder
-[+] Strategy 2: ai_edge_torch generic (wrapped logits-only)
-[+] Loading HF model on CPU with dtype=torch.float32 ...
-`torch_dtype` is deprecated! Use `dtype` instead!
-
-Loading weights:   0%|          | 0/883 [00:00<?, ?it/s]
-Loading weights:   8%|████████ | 67/883 [00:00<00:01, 498.07it/s]
-Loading weights:  13%|██████████████ | 117/883 [00:00<00:01, 440.12it/s]
-Loading weights:  18%|████████████████████ | 161/883 [00:00<00:02, 330.40it/s]
-Loading weights:  22%|████████████████████████ | 198/883 [00:00<00:02, 337.14it/s]
-Loading weights:  27%|█████████████████████████████ | 237/883 [00:00<00:01, 340.87it/s]
-Loading weights:  31%|█████████████████████████████████ | 272/883 [00:00<00:01, 340.67it/s]
-Loading weights:  35%|█████████████████████████████████████ | 307/883 [00:00<00:01, 342.47it/s]
-Loading weights:  39%|█████████████████████████████████████████ | 342/883 [00:00<00:01, 334.91it/s]
-Loading weights:  43%|██████████████████████████████████████████████ | 376/883 [00:01<00:01, 288.99it/s]
-Loading weights:  47%|██████████████████████████████████████████████████ | 412/883 [00:01<00:01, 307.09it/s]
-Loading weights:  54%|█████████████████████████████████████████████████████████ | 473/883 [00:01<00:01, 388.67it/s]
-Loading weights:  86%|█████████████████████████████████████████████████████████████████████████████████████████ | 758/883 [00:01<00:00, 1076.01it/s]
-Loading weights: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████| 883/883 [00:01<00:00, 597.78it/s]
-(00:00) [START] LiteRT-Torch Convert
-(00:00) [START] LiteRT-Torch Convert > Torch Export: serving_default
-(00:08) [START] LiteRT-Torch Convert > Torch Export: serving_default > ExportedProgram Run Decompositions
-(00:25) [ DONE] LiteRT-Torch Convert > Torch Export: serving_default > ExportedProgram Run Decompositions (+00:16)
-(00:25) [ DONE] LiteRT-Torch Convert > Torch Export: serving_default (+00:25)
-(00:25) [START] LiteRT-Torch Convert > Run FX Passes
-(00:26) [START] LiteRT-Torch Convert > Run FX Passes > ExportedProgram Run Decompositions
-(00:26) [ DONE] LiteRT-Torch Convert > Run FX Passes > ExportedProgram Run Decompositions (+00:00)
-(00:26) [ DONE] LiteRT-Torch Convert > Run FX Passes (+00:01)
-(00:26) [START] LiteRT-Torch Convert > Lower to MLIR: serving_default
-(00:26) [START] LiteRT-Torch Convert > Lower to MLIR: serving_default > ExportedProgram Run Decompositions
-(00:42) [ DONE] LiteRT-Torch Convert > Lower to MLIR: serving_default > ExportedProgram Run Decompositions (+00:15)
-(00:42) [START] LiteRT-Torch Convert > Lower to MLIR: serving_default > ExportedProgram Run Decompositions
-(00:59) [ DONE] LiteRT-Torch Convert > Lower to MLIR: serving_default > ExportedProgram Run Decompositions (+00:16)
-(00:59) [START] LiteRT-Torch Convert > Lower to MLIR: serving_default > Create MLIR Module
-WARNING:jax._src.xla_bridge:An NVIDIA GPU may be present on this machine, but a CUDA-enabled jaxlib is not installed. Falling back to cpu.
-(01:32) [ DONE] LiteRT-Torch Convert > Lower to MLIR: serving_default > Create MLIR Module (+00:32)
-(01:32) [ DONE] LiteRT-Torch Convert > Lower to MLIR: serving_default (+01:05)
-(01:32) [START] LiteRT-Torch Convert > Merge MLIR Modules
-(01:32) [ DONE] LiteRT-Torch Convert > Merge MLIR Modules (+00:00)
-(01:32) [START] LiteRT-Torch Convert > Run LiteRT Converter Passes
-(07:03) [ DONE] LiteRT-Torch Convert > Run LiteRT Converter Passes (+05:31)
-(07:03) [ DONE] LiteRT-Torch Convert (+07:03)
-(00:00) [START] Write Model to /home/ubuntu/tflite_output/none/translategemma-4b-it-generic-none.tflite
-Module size is greater than 2GB
-(00:07) [ DONE] Write Model to /home/ubuntu/tflite_output/none/translategemma-4b-it-generic-none.tflite (+00:07)
-[+] Strategy 2 success: /home/ubuntu/tflite_output/none/translategemma-4b-it-generic-none.tflite
-[!] Generic TFLite may not have MediaPipe LLM prefill/decode signatures.
-[+] Bundling .task -> /home/ubuntu/output/translategemma-4b-it-none.task
-[+] BundleConfig params: ['tflite_model', 'tokenizer_model', 'start_token', 'stop_tokens', 'output_filename', 'enable_bytes_to_unicode_mapping', 'system_prompt', 'prompt_prefix_user', 'prompt_suffix_user', 'prompt_prefix_model', 'prompt_suffix_model', 'prompt_prefix_system', 'prompt_suffix_system', 'user_role_token', 'system_role_token', 'model_role_token', 'end_role_token', 'tflite_embedder', 'tflite_per_layer_embedder', 'tflite_vision_encoder', 'tflite_vision_adapter']
-[+] DONE: /home/ubuntu/output/translategemma-4b-it-none.task
-[+] Size: 14810.01 MB
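
Note: only the log was committed, not the conversion script. For context, a minimal sketch of what the "Strategy 2" generic export recorded above might look like, using the public ai_edge_torch convert/export API. The logits-only wrapper, the sample sequence length, and the direct use of Gemma3ForConditionalGeneration are assumptions inferred from the log, not the actual script.

```python
# Sketch only: the wrapper and sample shape are assumptions; paths come from the log.
import torch
import ai_edge_torch
from transformers import Gemma3ForConditionalGeneration  # architecture named in the log

MODEL_DIR = "/home/ubuntu/translategemma-4b-it"
OUT_PATH = "/home/ubuntu/tflite_output/none/translategemma-4b-it-generic-none.tflite"

class LogitsOnly(torch.nn.Module):
    """Hypothetical wrapper exposing only input_ids -> logits."""
    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, input_ids):
        return self.model(input_ids=input_ids).logits

# Matches the log: CPU load in float32 (and the `dtype` deprecation notice).
hf_model = Gemma3ForConditionalGeneration.from_pretrained(MODEL_DIR, dtype=torch.float32)
wrapped = LogitsOnly(hf_model).eval()

sample_args = (torch.zeros((1, 128), dtype=torch.long),)  # assumed sequence length
edge_model = ai_edge_torch.convert(wrapped, sample_args)  # torch export -> MLIR -> TFLite
edge_model.export(OUT_PATH)                               # the >2GB write seen in the log
```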
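The last log lines cover packaging the .tflite into a MediaPipe .task bundle. A sketch of that step with the MediaPipe genai bundler, whose BundleConfig fields match the parameter list printed above; the tokenizer path and the start/stop token strings are assumed Gemma-style values, not values recorded in the log.

```python
# Sketch only: tokenizer location and control tokens are assumptions.
from mediapipe.tasks.python.genai import bundler

config = bundler.BundleConfig(
    tflite_model="/home/ubuntu/tflite_output/none/translategemma-4b-it-generic-none.tflite",
    tokenizer_model="/home/ubuntu/translategemma-4b-it/tokenizer.model",  # assumed location
    start_token="<bos>",                       # assumed Gemma-style tokens
    stop_tokens=["<eos>", "<end_of_turn>"],
    output_filename="/home/ubuntu/output/translategemma-4b-it-none.task",
    enable_bytes_to_unicode_mapping=False,
)
bundler.create_bundle(config)
```

As the log itself warns, a generic logits-only .tflite may lack the prefill/decode signatures the MediaPipe LLM Inference runtime expects, so the bundled .task may not load there even though bundling succeeds.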