---
library_name: transformers
license: apache-2.0
base_model:
- arcee-ai/SuperNova-Medius
tags:
- llmcompressor
---
# SuperNova-Medius-FP8-Dynamic

This is an FP8-quantized version of [arcee-ai/SuperNova-Medius](https://huggingface.co/arcee-ai/SuperNova-Medius), produced with the [llmcompressor](https://github.com/vllm-project/llm-compressor) library using dynamic per-token activation scaling.
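The quantization can be reproduced with a short llmcompressor script. This is a sketch based on the library's FP8 example, not the exact script used for this checkpoint; the `SAVE_DIR` name and the `ignore=["lm_head"]` choice are assumptions, and running it requires the base model weights and a suitable GPU.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

from llmcompressor.modifiers.quantization import QuantizationModifier
from llmcompressor.transformers import oneshot

MODEL_ID = "arcee-ai/SuperNova-Medius"

# Load the base model in its native precision.
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, device_map="auto", torch_dtype="auto"
)
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# FP8_DYNAMIC: static per-channel weight scales, dynamic per-token
# activation scales, so no calibration dataset is needed.
# Keeping lm_head in higher precision is a common choice (assumption here).
recipe = QuantizationModifier(
    targets="Linear", scheme="FP8_DYNAMIC", ignore=["lm_head"]
)

# Apply the recipe in one shot (data-free for this scheme).
oneshot(model=model, recipe=recipe)

# Save the compressed checkpoint (directory name is illustrative).
SAVE_DIR = "SuperNova-Medius-FP8-Dynamic"
model.save_pretrained(SAVE_DIR)
tokenizer.save_pretrained(SAVE_DIR)
```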

For more information about the quantization method, see the [FP8 W8A8 quantization example](https://github.com/vllm-project/llm-compressor/tree/main/examples/quantization_w8a8_fp8) in the llm-compressor repository.
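Checkpoints produced this way load directly in vLLM, which supports FP8 weights on recent NVIDIA GPUs. A minimal offline-inference sketch, assuming the model is available under the local directory name used above (the path is illustrative, not a confirmed repository ID):

```python
from vllm import LLM, SamplingParams

# Point this at the quantized checkpoint (local path or HF repo ID).
llm = LLM(model="SuperNova-Medius-FP8-Dynamic")

sampling_params = SamplingParams(temperature=0.7, max_tokens=128)

# Generate a completion for a single prompt.
outputs = llm.generate(["Explain FP8 quantization in one sentence."], sampling_params)
print(outputs[0].outputs[0].text)
```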