sh0ck0r/MiquSuperdark-70B-v2-FP8-Dynamic
Text Generation · Transformers · Safetensors · llama · fp8 · vllm · compressed-tensors · quantized · llmcompressor · conversational · text-generation-inference
License: apache-2.0
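The repository's ~69.5 GB total is consistent with FP8 storage (one byte per weight) for a nominally 70B-parameter model. A quick back-of-envelope check (the parameter count is the nominal figure from the model name, not an exact count; FP8-Dynamic checkpoints also keep some tensors, such as scales, in higher precision, so the real size differs slightly):

```python
# Rough sanity check: FP8 = 8 bits = 1 byte per parameter, so a
# ~70B-parameter model should occupy roughly 70 GB on disk.
params = 70e9            # nominal parameter count (from the "70B" in the name)
bytes_per_param = 1      # FP8 stores one byte per weight
approx_gb = params * bytes_per_param / 1e9
print(f"~{approx_gb:.0f} GB")  # prints ~70 GB, close to the 69.5 GB listed
```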
Files and versions (branch: main)
69.5 GB · 1 contributor · History: 2 commits
Latest commit: sh0ck0r · "Upload FP8 quantized version of ddh0/MiquSuperdark-70B-v2" · 9b53faf (verified) · 13 days ago
All files were last updated 13 days ago. Apart from .gitattributes (initial commit), every file was added in the commit "Upload FP8 quantized version of ddh0/MiquSuperdark-70B-v2".

.gitattributes                      1.52 kB
README.md                           3.63 kB
chat_template.jinja                 470 Bytes
config.json                         1.88 kB
generation_config.json              132 Bytes
model-00001-of-00015.safetensors    4.95 GB
model-00002-of-00015.safetensors    4.98 GB
model-00003-of-00015.safetensors    4.9 GB
model-00004-of-00015.safetensors    4.9 GB
model-00005-of-00015.safetensors    4.9 GB
model-00006-of-00015.safetensors    4.98 GB
model-00007-of-00015.safetensors    4.9 GB
model-00008-of-00015.safetensors    4.9 GB
model-00009-of-00015.safetensors    4.9 GB
model-00010-of-00015.safetensors    4.98 GB
model-00011-of-00015.safetensors    4.9 GB
model-00012-of-00015.safetensors    4.9 GB
model-00013-of-00015.safetensors    4.9 GB
model-00014-of-00015.safetensors    4.98 GB
model-00015-of-00015.safetensors    524 MB
model.safetensors.index.json        109 kB
recipe.yaml                         136 Bytes
special_tokens_map.json             548 Bytes
tokenizer.json                      3.62 MB
tokenizer.model                     500 kB
tokenizer_config.json               991 Bytes