Runtime error
Exit code: 1. Reason:
/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be depracted in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
WARNING! max_length is not default parameter.
    max_length was transferred to model_kwargs.
    Please make sure that max_length is what you intended.
The token has not been saved to the git credentials helper. Pass `add_to_git_credential=True` in this function directly or `--add-to-git-credential` if using via `huggingface-cli` if you want to set the git credential as well.
Token is valid (permission: write).
Your token has been saved to /home/user/.cache/huggingface/token
Login successful
/home/user/app/app.py:31: LangChainDeprecationWarning: The class `Chroma` was deprecated in LangChain 0.2.9 and will be removed in 1.0. An updated version of the class exists in the langchain-chroma package and should be used instead. To use it run `pip install -U langchain-chroma` and import as `from langchain_chroma import Chroma`.
  vectorstore = Chroma(embedding_function=embedding_model, persist_directory="chroma_db")
Traceback (most recent call last):
  File "/home/user/app/app.py", line 62, in <module>
    doc_upload = gr.File(label="Upload your PDF documents", file_types=[".pdf"], multiple=True)
  File "/usr/local/lib/python3.10/site-packages/gradio/component_meta.py", line 167, in wrapper
    return fn(self, **kwargs)
TypeError: File.__init__() got an unexpected keyword argument 'multiple'