xFormers

We recommend xFormers for both inference and training. In our tests, the optimizations performed in the attention blocks yield both faster execution and reduced memory consumption.

Install xFormers from pip:

pip install xformers

The xFormers pip package requires the latest version of PyTorch. If you need to use an older version of PyTorch, we recommend installing xFormers from source.
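For reference, a source install usually looks like the commands below (this exact invocation is an assumption; check the xFormers repository for the current build instructions):

pip install ninja
pip install -v -U git+https://github.com/facebookresearch/xformers.git@main#egg=xformers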

After xFormers is installed, you can use it with set_attention_backend() as shown in the Attention backends guide.
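For example, here is a minimal sketch of switching a pipeline's model to the xFormers backend (the checkpoint and the "xformers" backend name below are assumptions; see the Attention backends guide for the supported names):

import torch
from diffusers import DiffusionPipeline

# Load a pipeline; any diffusers model that exposes set_attention_backend()
# can be switched the same way.
pipeline = DiffusionPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")

# Route attention through xFormers memory-efficient attention.
pipeline.transformer.set_attention_backend("xformers")

image = pipeline("a photo of an astronaut riding a horse").images[0]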

According to this issue, xFormers v0.0.16 cannot be used for training (fine-tuning or DreamBooth) on some GPUs. If you observe this problem, install a development version as indicated in the issue comments.
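A development build can often be installed from pip's pre-release channel (whether this picks the right build is an assumption; follow the issue comments for the recommended version):

pip install --pre -U xformers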
