Runtime error
Exit code: 1. Reason:

/lib/python3.11/site-packages/transformers/utils/hub.py:110: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.
  warnings.warn(
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/home/user/.local/lib/python3.11/site-packages/vllm/entrypoints/openai/api_server.py", line 541, in <module>
    parser = make_arg_parser(parser)
             ^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/.local/lib/python3.11/site-packages/vllm/entrypoints/openai/cli_args.py", line 353, in make_arg_parser
    parser = AsyncEngineArgs.add_cli_args(parser)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/.local/lib/python3.11/site-packages/vllm/engine/arg_utils.py", line 2185, in add_cli_args
    parser = EngineArgs.add_cli_args(parser)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/.local/lib/python3.11/site-packages/vllm/engine/arg_utils.py", line 1241, in add_cli_args
    vllm_kwargs = get_kwargs(VllmConfig)
                  ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/.local/lib/python3.11/site-packages/vllm/engine/arg_utils.py", line 358, in get_kwargs
    return copy.deepcopy(_compute_kwargs(cls))
                         ^^^^^^^^^^^^^^^^^^^^
  File "/home/user/.local/lib/python3.11/site-packages/vllm/engine/arg_utils.py", line 270, in _compute_kwargs
    default = default.default_factory()  # type: ignore[call-arg]
              ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/.local/lib/python3.11/site-packages/pydantic/_internal/_dataclasses.py", line 121, in __init__
    s.__pydantic_validator__.validate_python(ArgsKwargs(args, kwargs), self_instance=s)
  File "/home/user/.local/lib/python3.11/site-packages/vllm/config/device.py", line 56, in __post_init__
    raise RuntimeError(
RuntimeError: Failed to infer device type, please set the environment variable `VLLM_LOGGING_LEVEL=DEBUG` to turn on verbose logging to help debug the issue.
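The `RuntimeError` means vLLM's platform detection could not find any supported accelerator (typically because no GPU is visible to the process), and the message itself suggests the first debugging step: re-run with `VLLM_LOGGING_LEVEL=DEBUG`. The FutureWarning separately recommends migrating from `TRANSFORMERS_CACHE` to `HF_HOME`. A minimal environment-setup sketch covering both (the `HF_HOME` mapping below is an assumption based on Hugging Face's usual `$HF_HOME/hub` cache layout, and the path is only illustrative):

```shell
# Turn on verbose vLLM logging, as the error message suggests,
# so the next launch prints why device inference failed.
export VLLM_LOGGING_LEVEL=DEBUG

# Address the FutureWarning: move off the deprecated TRANSFORMERS_CACHE.
# Only applies if TRANSFORMERS_CACHE is actually set in this environment.
# Assumption: the cache directory was $HF_HOME/hub, so we strip a trailing /hub.
if [ -n "${TRANSFORMERS_CACHE:-}" ]; then
  export HF_HOME="${TRANSFORMERS_CACHE%/hub}"
  unset TRANSFORMERS_CACHE
fi

echo "VLLM_LOGGING_LEVEL=$VLLM_LOGGING_LEVEL"
```

After this, relaunching the same server command should emit debug output from vLLM's platform-detection step, which usually reveals whether the container simply has no GPU attached or the CUDA runtime is missing.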