Add Nightly Versions of PyTorch, TorchAudio, TorchVision, and xFormers (with CUDA Support)

#3
by donn22 - opened

Hi,

Could you please add nightly versions of the following packages to the environment or builds?

torch

torchaudio

torchvision

xformers

I specifically need the matching PyTorch nightly build with CUDA support for ComfyUI.

I’ve searched across different platforms, but it’s been difficult to find fully compatible nightly versions that align with each other — especially for components like FlashAttention, SageAttention, and xFormers.

If possible, could you at least include xFormers and provide an official link to the latest PyTorch nightly build with CUDA?

Official nightly builds can be found here:
🔗 https://download.pytorch.org/whl/nightly/cu129
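For reference, the request above can be sketched as a small helper that builds the matching-nightly install command for a given CUDA tag. The `--pre` / `--index-url` form follows PyTorch's official install instructions; `cu129` is the tag from the URL above, and the package list is the one requested (xFormers itself is not published on that index, which is why it needs a separate wheel):

```python
# Build the pip command for installing matching PyTorch nightly wheels
# with CUDA support from the official nightly index.
def nightly_install_cmd(cuda_tag: str,
                        packages=("torch", "torchvision", "torchaudio")) -> str:
    # cuda_tag, e.g. "cu129", selects the CUDA build of the nightly index.
    index = f"https://download.pytorch.org/whl/nightly/{cuda_tag}"
    return "pip install --pre " + " ".join(packages) + f" --index-url {index}"

print(nightly_install_cmd("cu129"))
# pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu129
```

Installing all three packages in one command from the same index is what keeps the torch/torchvision/torchaudio nightlies aligned with each other.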

I also checked the following sources, but couldn’t find exact matching versions:

https://huggingface.co/Wildminder/AI-windows-whl/tree/main

https://github.com/Rogala/AI_Attention

Thanks in advance for considering this!

PyTorch nightly is 2.10; PyTorch 2.9.0 is stable now. Please open an issue on GitHub with a list of the required packages, and I will compile the necessary .whl files ASAP.

Could you please add an exactly matching, working xFormers build for ComfyUI?

Currently, PyTorch nightly is 2.10, while the stable release is 2.9.0.
I need a compiled xFormers wheel that works with Python 3.12 or 3.13.

Additionally, I hope your FlashAttention and SageAttention builds will work with these versions as well, though I haven't yet tested them for exact compatibility with the PyTorch nightly builds (2.10) on Python 3.12 or 3.13.

xFormers ships abi3 wheels (the stable CPython ABI), so a single wheel works with Python 3.9 and newer.
The latest xFormers wheel was compiled against CUDA 13.0:
https://huggingface.co/Wildminder/AI-windows-whl/blob/main/xformers-0.0.33%2B00a7a5f0.d20251021-cp39-abi3-win_amd64.whl
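You can read the compatibility tags straight from the wheel filename (the standard `name-version-pythontag-abitag-platformtag.whl` layout from PEP 427). The filename below is the xFormers wheel linked above; `abi3` means it targets the stable CPython ABI, so `cp39` marks the *minimum* Python version rather than the only one. Note that xFormers additionally compares its build metadata (exact PyTorch and Python versions) at import time, which is what produces the warning in the next post:

```python
# Extract the python/ABI/platform tags from a wheel filename (PEP 427).
def wheel_tags(filename: str):
    # The last three dash-separated fields before ".whl" are always
    # the python tag, the ABI tag, and the platform tag.
    parts = filename.removesuffix(".whl").split("-")
    python_tag, abi_tag, platform_tag = parts[-3:]
    return python_tag, abi_tag, platform_tag

name = "xformers-0.0.33+00a7a5f0.d20251021-cp39-abi3-win_amd64.whl"
print(wheel_tags(name))  # ('cp39', 'abi3', 'win_amd64')
```

So pip will happily install this wheel on Python 3.12; it is xFormers' own runtime check, not the wheel tags, that flags the mismatch.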

I get this error after installing the latest xformers you provided.

WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for:
PyTorch 2.10.0.dev20251019+cu130 with CUDA 1300 (you have 2.9.0+cu130)
Python 3.13.7 (you have 3.12.12)
Please reinstall xformers (see https://github.com/facebookresearch/xformers#installing-xformers)
Memory-efficient attention, SwiGLU, sparse and more won't be available.
Set XFORMERS_MORE_DETAILS=1 for more details
xformers version: 0.0.33+00a7a5f0.d20251021

@baesik please wait a little bit, I will recompile it

If you need support or want to resolve issues faster, join the Telegram channel or post issues on GitHub.
