runtime error
Exit code: 1. Reason:

/usr/local/lib/python3.13/site-packages/smolagents/models.py:857: FutureWarning: The 'model_id' parameter will be required in version 2.0.0. Please update your code to pass this parameter to avoid future errors. For now, it defaults to 'anthropic/claude-3-5-sonnet-20240620'.
  warnings.warn(
Traceback (most recent call last):
  File "/usr/local/lib/python3.13/site-packages/smolagents/models.py", line 881, in create_client
    import litellm
ModuleNotFoundError: No module named 'litellm'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/app.py", line 60, in <module>
    model = LiteLLMModel(
        model="gpt-3.5-turbo",  # Free model that works
        api_base="https://api.openai.com/v1",  # OpenAI endpoint
        temperature=0.5
    )
  File "/usr/local/lib/python3.13/site-packages/smolagents/models.py", line 871, in __init__
    super().__init__(
    ~~~~~~~~~~~~~~~~^
        model_id=model_id,
        ^^^^^^^^^^^^^^^^^^
    ...<2 lines>...
        **kwargs,
        ^^^^^^^^^
    )
    ^
  File "/usr/local/lib/python3.13/site-packages/smolagents/models.py", line 809, in __init__
    self.client = self.create_client()
                  ~~~~~~~~~~~~~~~~~~^^
  File "/usr/local/lib/python3.13/site-packages/smolagents/models.py", line 883, in create_client
    raise ModuleNotFoundError(
        "Please install 'litellm' extra to use LiteLLMModel: `pip install 'smolagents[litellm]'`"
    ) from e
ModuleNotFoundError: Please install 'litellm' extra to use LiteLLMModel: `pip install 'smolagents[litellm]'`
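The root cause is that the `litellm` package is not installed in the container: smolagents' `LiteLLMModel` needs it, and the final exception message states the fix itself. Adding the install step to the image build or startup script (exact placement depends on your Dockerfile, which is not shown here) should resolve the `ModuleNotFoundError`:

```shell
# Install smolagents with the litellm extra, as the error message instructs
pip install 'smolagents[litellm]'
```

Separately, the `FutureWarning` at the top indicates that `app.py` passes `model="gpt-3.5-turbo"` while the constructor expects `model_id`; passing `model_id="gpt-3.5-turbo"` instead should silence the warning and keep the code working in smolagents 2.0.0.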