runtime error
Exit code: 1. Reason:
config.json: 100%|██████████| 1.53k/1.53k [00:00<00:00, 5.22MB/s]
tokenizer_config.json: 100%|██████████| 2.50k/2.50k [00:00<00:00, 10.9MB/s]
tokenizer.json: 100%|██████████| 2.42M/2.42M [00:00<00:00, 20.2MB/s]
special_tokens_map.json: 100%|██████████| 2.20k/2.20k [00:00<00:00, 10.2MB/s]
pytorch_model.bin: 100%|██████████| 308M/308M [00:03<00:00, 93.2MB/s]
Loading weights: 100%|██████████| 192/192 [00:00<00:00, 23499.56it/s]
[transformers] The tied weights mapping and config for this model specifies to tie shared.weight to lm_head.weight, but both are present in the checkpoints with different values, so we will NOT tie them. You should update the config with `tie_word_embeddings=False` to silence this warning.
generation_config.json: 100%|██████████| 142/142 [00:00<00:00, 584kB/s]
Traceback (most recent call last):
  File "/app/app.py", line 50, in <module>
    demo = gr.ChatInterface(
        fn=tinislm_chat,
        ...<10 lines>...
        clear_btn="🗑️ Удалить всё",  # Russian: "Delete all"
    )
TypeError: ChatInterface.__init__() got an unexpected keyword argument 'theme'
model.safetensors: 100%|██████████| 308M/308M [00:02<00:00, 131MB/s]
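The actual crash is the final `TypeError`: the installed Gradio version's `ChatInterface.__init__()` does not accept a `theme` keyword, so the app in `/app/app.py` dies at construction time. One version-tolerant workaround is to filter the keyword arguments against the constructor's real signature before calling it. The sketch below uses a hypothetical `demo_init` stand-in so it runs without Gradio; in the app you would pass `gr.ChatInterface` as the callable instead:

```python
import inspect

def supported_kwargs(callable_obj, **kwargs):
    """Keep only the kwargs that the callable's signature accepts.

    Useful when a library drops keyword arguments (such as `theme`
    or `clear_btn`) between major versions.
    """
    params = inspect.signature(callable_obj).parameters
    # If the callable takes **kwargs, everything is safe to pass through.
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return kwargs
    return {k: v for k, v in kwargs.items() if k in params}

# Hypothetical stand-in for gr.ChatInterface on a version without `theme`:
def demo_init(fn, title=None, clear_btn=None):
    return {"fn": fn, "title": title, "clear_btn": clear_btn}

safe = supported_kwargs(demo_init, fn=print, title="TiniSLM",
                        theme="soft", clear_btn="Delete all")
demo = demo_init(**safe)  # `theme` was silently dropped, no TypeError
```

Alternatively, pin the Gradio version in the Space's `requirements.txt` to one whose `ChatInterface` still accepts the keywords the app uses, or simply delete the unsupported arguments from the call.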
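Separately, the transformers warning about `shared.weight` and `lm_head.weight` is non-fatal, but it can be silenced as the log suggests by setting `tie_word_embeddings` to `false` in the checkpoint's `config.json`. A minimal sketch of that edit, using an illustrative in-memory config rather than the real file:

```python
import json

# Minimal stand-in for the checkpoint's config.json (fields are illustrative).
config_text = '{"model_type": "t5", "tie_word_embeddings": true}'

cfg = json.loads(config_text)
# Stop transformers from trying to tie shared.weight to lm_head.weight.
cfg["tie_word_embeddings"] = False

updated = json.dumps(cfg, indent=2)  # write this back to config.json
```

If you load the model with `AutoModelForCausalLM.from_pretrained` (or the seq2seq equivalent), passing `tie_word_embeddings=False` as a config override at load time should have the same effect without editing the file.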