# Overview

Quantization lowers the memory requirements of loading and using a model by storing the weights in a lower precision while trying to preserve as much accuracy as possible. Weights are typically stored in full-precision (fp32) floating point representations, but half-precision (fp16 or bf16) is an increasingly popular data type given the large size of today's models. Some quantization methods reduce the precision even further, to integer representations like int8 or int4.

Transformers supports many quantization methods, each with its own pros and cons, so you can pick the best one for your specific use case. Some methods require calibration for greater accuracy and extreme compression (1-2 bits), while others work out of the box with on-the-fly quantization.

Use the table below to help you pick a quantization method based on your hardware and the number of bits you want to quantize to.

| Quantization Method | On-the-fly quantization | CPU | CUDA GPU | ROCm GPU | Metal (Apple Silicon) | Intel GPU | torch.compile() | Bits | PEFT Fine-Tuning | Serializable with 🤗 Transformers | 🤗 Transformers Support | Link to library |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| [AQLM](./aqlm) | 🔴 | 🟢 | 🟢 | 🔴 | 🔴 | 🟢 | 🟢 | 1/2 | 🟢 | 🟢 | 🟢 | https://github.com/Vahe1994/AQLM |
| [AutoRound](./auto_round) | 🔴 | 🟢 | 🟢 | 🔴 | 🔴 | 🟢 | 🔴 | 2/3/4/8 | 🔴 | 🟢 | 🟢 | https://github.com/intel/auto-round |
| [AWQ](./awq) | 🔴 | 🟢 | 🟢 | 🟢 | 🔴 | 🟢 | ? | 4 | 🟢 | 🟢 | 🟢 | https://github.com/casper-hansen/AutoAWQ |
| [bitsandbytes](./bitsandbytes) | 🟢 | 🟢 | 🟢 | 🟡 | 🟡 | 🟢 | 🟢 | 4/8 | 🟢 | 🟢 | 🟢 | https://github.com/bitsandbytes-foundation/bitsandbytes |
| [compressed-tensors](./compressed_tensors) | 🔴 | 🟢 | 🟢 | 🟢 | 🔴 | 🔴 | 🔴 | 1/8 | 🟢 | 🟢 | 🟢 | https://github.com/neuralmagic/compressed-tensors |
| [EETQ](./eetq) | 🟢 | 🔴 | 🟢 | 🔴 | 🔴 | 🔴 | ? | 8 | 🟢 | 🟢 | 🟢 | https://github.com/NetEase-FuXi/EETQ |
| [FP-Quant](./fp_quant) | 🟢 | 🔴 | 🟢 | 🔴 | 🔴 | 🔴 | 🟢 | 4 | 🔴 | 🟢 | 🟢 | https://github.com/IST-DASLab/FP-Quant |
| [GGUF / GGML (llama.cpp)](../gguf) | 🟢 | 🟢 | 🟢 | 🔴 | 🟢 | 🟢 | 🔴 | 1/8 | 🔴 | [See Notes](../gguf) | [See Notes](../gguf) | https://github.com/ggerganov/llama.cpp |
| [GPTQModel](./gptq) | 🔴 | 🟢 | 🟢 | 🟢 | 🟢 | 🟢 | 🔴 | 2/3/4/8 | 🟢 | 🟢 | 🟢 | https://github.com/ModelCloud/GPTQModel |
| [AutoGPTQ](./gptq) | 🔴 | 🔴 | 🟢 | 🟢 | 🔴 | 🔴 | 🔴 | 2/3/4/8 | 🟢 | 🟢 | 🟢 | https://github.com/AutoGPTQ/AutoGPTQ |
| [HIGGS](./higgs) | 🟢 | 🔴 | 🟢 | 🔴 | 🔴 | 🔴 | 🟢 | 2/4 | 🔴 | 🟢 | 🟢 | https://github.com/HanGuo97/flute |
| [HQQ](./hqq) | 🟢 | 🟢 | 🟢 | 🔴 | 🔴 | 🟢 | 🟢 | 1/8 | 🟢 | 🔴 | 🟢 | https://github.com/mobiusml/hqq/ |
| [optimum-quanto](./quanto) | 🟢 | 🟢 | 🟢 | 🔴 | 🟢 | 🟢 | 🟢 | 2/4/8 | 🔴 | 🔴 | 🟢 | https://github.com/huggingface/optimum-quanto |
| [FBGEMM_FP8](./fbgemm_fp8) | 🟢 | 🔴 | 🟢 | 🔴 | 🔴 | 🔴 | 🔴 | 8 | 🔴 | 🟢 | 🟢 | https://github.com/pytorch/FBGEMM |
| [torchao](./torchao) | 🟢 | 🟢 | 🟢 | 🔴 | 🟡 | 🟢 | | 4/8 | | 🟢 / 🔴 | 🟢 | https://github.com/pytorch/ao |
| [VPTQ](./vptq) | 🔴 | 🔴 | 🟢 | 🟡 | 🔴 | 🔴 | 🟢 | 1/8 | 🔴 | 🟢 | 🟢 | https://github.com/microsoft/VPTQ |
| [FINEGRAINED_FP8](./finegrained_fp8) | 🟢 | 🔴 | 🟢 | 🔴 | 🔴 | 🟢 | 🔴 | 8 | 🔴 | 🟢 | 🟢 | |
| [SpQR](./spqr) | 🔴 | 🔴 | 🟢 | 🔴 | 🔴 | 🔴 | 🟢 | 3 | 🔴 | 🟢 | 🟢 | https://github.com/Vahe1994/SpQR/ |
| [Quark](./quark) | 🔴 | 🟢 | 🟢 | 🟢 | 🟢 | 🟢 | ? | 2/4/6/8/9/16 | 🔴 | 🔴 | 🟢 | https://quark.docs.amd.com/latest/ |

## Resources

If you are new to quantization, we recommend checking out these beginner-friendly quantization courses, created in collaboration with DeepLearning.AI:

- [Quantization Fundamentals with Hugging Face](https://www.deeplearning.ai/short-courses/quantization-fundamentals-with-hugging-face/)
- [Quantization in Depth](https://www.deeplearning.ai/short-courses/quantization-in-depth)

## User-Friendly Quantization Tools

If you are looking for a user-friendly quantization experience, you can use the following community Spaces and notebooks:

- [Bitsandbytes Space](https://huggingface.co/spaces/bnb-community/bnb-my-repo)
- [GGUF Space](https://huggingface.co/spaces/ggml-org/gguf-my-repo)
- [MLX Space](https://huggingface.co/spaces/mlx-community/mlx-my-repo)
- [AutoQuant Notebook](https://colab.research.google.com/drive/1b6nqC7UZVt8bx4MksX7s656GXPM-eWw4?usp=sharing#scrollTo=ZC9Nsr9u5WhN)
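The memory savings from quantization come down to simple arithmetic: each parameter costs 4 bytes in fp32, 2 bytes in fp16/bf16, 1 byte in int8, and half a byte in int4. A minimal back-of-the-envelope sketch (the 7B parameter count is just an illustrative assumption, and this counts weights only, not activations or the KV cache):

```python
# Approximate weight memory at different precisions (weights only --
# activations, optimizer state, and KV cache are extra).
BYTES_PER_PARAM = {"fp32": 4.0, "fp16/bf16": 2.0, "int8": 1.0, "int4": 0.5}

def weight_memory_gb(num_params: int, dtype: str) -> float:
    """Approximate memory in gigabytes (1 GB = 1e9 bytes) to store the weights."""
    return num_params * BYTES_PER_PARAM[dtype] / 1e9

# A hypothetical 7B-parameter model as an example:
for dtype in BYTES_PER_PARAM:
    print(f"{dtype:>9}: {weight_memory_gb(7_000_000_000, dtype):5.1f} GB")
# fp32 needs ~28 GB, while int4 brings the same weights down to ~3.5 GB.
```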
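To make "storing weights in a lower precision while preserving accuracy" concrete, here is a toy sketch of symmetric absmax int8 quantization, the basic idea behind several 8-bit schemes. Real libraries quantize per block or per channel in optimized kernels; this is an illustration only, not any particular library's implementation:

```python
def quantize_absmax_int8(weights):
    """Map floats into [-127, 127] by scaling with the largest magnitude."""
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

def dequantize(quantized, scale):
    """Recover approximate float weights from the int values and the scale."""
    return [q * scale for q in quantized]

weights = [0.42, -1.5, 0.03, 0.9]
quantized, scale = quantize_absmax_int8(weights)  # ints in [-127, 127]
recovered = dequantize(quantized, scale)
# Each recovered weight is within scale/2 (about 0.006 here) of the original.
```

The rounding error per weight is bounded by half the scale, which is why outlier weights (which inflate the scale) hurt accuracy and why per-block scaling and calibration help.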