mixtao/MixTAO-7Bx2-MoE-Instruct-v3.0
By MixTAO Labs
Tags: Text Generation · Transformers · Safetensors · mixtral · Mixture of Experts · text-generation-inference
License: apache-2.0
MixTAO-7Bx2-MoE-Instruct-v3.0

MixTAO-7Bx2-MoE-Instruct-v3.0 is a Mixture of Experts (MoE) experiment.

💻 Usage

1. text-generation-webui
Model Tab
Parameters Tab
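Besides text-generation-webui, the model can be loaded directly with the Hugging Face transformers library. The sketch below is a minimal, hedged example: the repo id `mixtao/MixTAO-7Bx2-MoE-Instruct-v3.0` is taken from this page, and the prompt text is an illustrative placeholder (the card does not document a specific chat template).

```python
# Minimal sketch: load MixTAO-7Bx2-MoE-Instruct-v3.0 with transformers.
# Assumes the transformers and torch packages are installed and that
# enough GPU/CPU memory is available for a 13B-parameter model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mixtao/MixTAO-7Bx2-MoE-Instruct-v3.0"  # repo id from this model page

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # the card lists F16 tensors
    device_map="auto",          # place layers on available devices
)

# Hypothetical prompt for illustration only.
prompt = "What is a Mixture of Experts model?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Loading in float16 matches the tensor type listed below; quantized loading (e.g. via bitsandbytes) would reduce memory further but is not covered by this card.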
Downloads last month: 5
Format: Safetensors · Model size: 13B params · Tensor type: F16