This repository contains a Hugging Face export of Mixtral-8x7B-v0.1 quantized with AQLM using the 2-bit 2x8 scheme.
- Base model: mistralai/Mixtral-8x7B-v0.1
- Quantization: AQLM, 2-bit, 2x8 scheme
- Source checkpoint: /work/bduan1/quantized_models/Mixtral-8x7B-AQLM-2bit-2x8

This repo was produced with convert_to_hf.py from the AQLM project, then exported with --save_safetensors and --save_tokenizer.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "dbw6/Mixtral-8x7B-AQLM-2Bit-2x8-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",
    trust_remote_code=True,
)
```
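Once loaded, the model works with the standard `generate` API. A minimal sketch, assuming a CUDA GPU with enough memory and the AQLM inference kernels installed (`pip install aqlm[gpu,cpu]`); the prompt is illustrative only:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "dbw6/Mixtral-8x7B-AQLM-2Bit-2x8-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",
    trust_remote_code=True,
)

# Tokenize an example prompt and move it to the model's device.
inputs = tokenizer("The Mixtral architecture is", return_tensors="pt").to(model.device)

# Greedy decoding; add sampling parameters (do_sample, temperature, ...) as needed.
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=64)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that `device_map="auto"` requires the `accelerate` package; the 2-bit checkpoint is much smaller than the fp16 original, but generation speed depends on the AQLM kernels available for your hardware.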