runtime error
Exit code: 1. Reason:
Downloading GGUF model from HuggingFace...
llama-3.2-1b-instruct.Q4_K_M.gguf: 100%|██████████| 808M/808M [00:02<00:00, 294MB/s]
Model downloaded to: model/llama-3.2-1b-instruct.Q4_K_M.gguf
Loading GGUF model with optimized settings...
Model loaded successfully!
Traceback (most recent call last):
  File "/app/app.py", line 63, in <module>
    demo = gr.ChatInterface(
TypeError: ChatInterface.__init__() got an unexpected keyword argument 'theme'
Exception ignored in: <function Llama.__del__ at 0x7fc04409bac0>
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/llama_cpp/llama.py", line 2099, in __del__
  File "/usr/local/lib/python3.10/site-packages/llama_cpp/llama.py", line 2096, in close
  File "/usr/local/lib/python3.10/contextlib.py", line 584, in close
  File "/usr/local/lib/python3.10/contextlib.py", line 576, in __exit__
  File "/usr/local/lib/python3.10/contextlib.py", line 561, in __exit__
  File "/usr/local/lib/python3.10/contextlib.py", line 340, in __exit__
  File "/usr/local/lib/python3.10/site-packages/llama_cpp/_internals.py", line 66, in close
  File "/usr/local/lib/python3.10/contextlib.py", line 584, in close
  File "/usr/local/lib/python3.10/contextlib.py", line 576, in __exit__
  File "/usr/local/lib/python3.10/contextlib.py", line 561, in __exit__
  File "/usr/local/lib/python3.10/contextlib.py", line 449, in _exit_wrapper
  File "/usr/local/lib/python3.10/site-packages/llama_cpp/_internals.py", line 60, in free_model
TypeError: 'NoneType' object is not callable