runtime error
Exit code: 1. Reason:
  ... self._transform_stream_with_config(
  File "/usr/local/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2312, in _transform_stream_with_config
    chunk: Output = context.run(next, iterator)
  File "/usr/local/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3458, in _transform
    yield from final_pipeline
  File "/usr/local/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1534, in transform
    for ichunk in input:
  File "/usr/local/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 5750, in transform
    yield from self.bound.transform(
  File "/usr/local/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1552, in transform
    yield from self.stream(final, config, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 523, in stream
    for chunk in self._stream(input_messages, stop=stop, **kwargs):
  File "/usr/local/lib/python3.10/site-packages/langchain_huggingface/chat_models/huggingface.py", line 678, in _stream
    for chunk in self.llm.client.chat_completion(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/inference/_client.py", line 915, in chat_completion
    data = self._inner_post(request_parameters, stream=stream)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/inference/_client.py", line 275, in _inner_post
    hf_raise_for_status(response)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_http.py", line 475, in hf_raise_for_status
    raise _format(HfHubHTTPError, str(e), response) from e
huggingface_hub.errors.HfHubHTTPError: 402 Client Error: Payment Required for url: https://router.huggingface.co/novita/v3/openai/chat/completions (Request ID: Root=1-691ccf0f-5116cfca09f4362f4431e9a4;37ea946d-afae-4f8c-a647-95733b244f0b)
You have exceeded your monthly included credits for Inference Providers. Subscribe to PRO to get 20x more monthly included credits.
Observation: tickets
Container logs:
Fetching error logs...