Is it possible for vllm (or other) to run this NVFP4 on RTX 5090 with 128GB RAM?

#6
by vladulidlo - opened

I tried multiple vLLM installs, including 0.16.0 final, and all I get are errors, across many different parameter combinations including CPU offloading. I consulted ChatGPT and made over 100 attempts, but it never worked for me. It always crashes:

(EngineCore_DP0 pid=84120) WARNING 02-27 14:11:29 [decorators.py:590] unable to save AOT compiled function to /home/vlady/.cache/vllm/torch_compile_cache/torch_aot_compile/60dd78938ac85ab23fa92572573f3271e84bc07d0b4c16e1e978a186b85cc995/rank_0_0/model:
(EngineCore_DP0 pid=84120) INFO 02-27 14:11:30 [gpu_worker.py:405] Available KV cache memory: 0.94 GiB
(EngineCore_DP0 pid=84120) INFO 02-27 14:11:30 [kv_cache_utils.py:1314] GPU KV cache size: 9,792 tokens
(EngineCore_DP0 pid=84120) INFO 02-27 14:11:30 [kv_cache_utils.py:1319] Maximum concurrency for 16,384 tokens per request: 2.21x
(EngineCore_DP0 pid=84120) ERROR 02-27 14:11:30 [core.py:1078] EngineCore failed to start.
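The KV-cache numbers in the log above can be sanity-checked with a bit of arithmetic (this is just plain full-attention math; the 2.21x concurrency figure reported by vLLM suggests some layers have a cheaper effective cache cost, e.g. sliding-window attention, which this sketch does not model):

```python
# Back-of-the-envelope check of the engine's reported KV-cache budget.
kv_cache_bytes = 0.94 * 1024**3   # "Available KV cache memory: 0.94 GiB"
kv_cache_tokens = 9_792           # "GPU KV cache size: 9,792 tokens"

# Implied per-token cache cost.
bytes_per_token = kv_cache_bytes / kv_cache_tokens
print(f"~{bytes_per_token / 1024:.0f} KiB per cached token")

# By plain full-attention accounting, one 16,384-token request would need:
needed_gib = 16_384 * bytes_per_token / 1024**3
print(f"~{needed_gib:.2f} GiB for a single 16,384-token sequence")
```

So with under 1 GiB left over for the cache, there is essentially no headroom: almost all of the 5090's ~31 GiB is consumed by weights and activations before the cache is even allocated.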

or

torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 512.00 MiB. GPU 0 has a total capacity of 31.36 GiB of which 357.62 MiB is free. Process 3134 has 703.11 MiB memory in use. Including non-PyTorch memory, this process has 29.33 GiB memory in use. Of the allocated memory 28.54 GiB is allocated by PyTorch, and 143.66 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
[rank0]:[W227 14:23:16.541489619 ProcessGroupNCCL.cpp:1524] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
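For what it's worth, here is the shape of invocation I would try for this situation. This is a hedged sketch, not a verified fix: the flags (`--cpu-offload-gb`, `--enforce-eager`, `--gpu-memory-utilization`, `--max-model-len`) are real vLLM options, but `<model-id>` and the gigabyte values are placeholders, and whether NVFP4 kernels work at all on a consumer Blackwell card is exactly the open question of this thread.

```shell
# expandable_segments, as suggested by the OOM message itself, can reduce
# allocator fragmentation.
export PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True

# --cpu-offload-gb spills part of the weights to system RAM (you have 128 GB);
# --enforce-eager disables torch.compile, sidestepping the AOT-compile cache
# warning from the log; --gpu-memory-utilization below 1.0 leaves headroom
# for non-PyTorch allocations on the 32 GiB card.
vllm serve <model-id> \
  --max-model-len 16384 \
  --gpu-memory-utilization 0.90 \
  --cpu-offload-gb 32 \
  --enforce-eager
```

Lowering `--max-model-len` further (e.g. 8192) trades context length for KV-cache headroom if the engine still fails to start.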

