mixtao/MixTAO-7Bx2-MoE-Instruct-v4.0 (MixTAO Labs)
Tags: Text Generation · Transformers · Safetensors · mixtral · Mixture of Experts · Merge · text-generation-inference
License: apache-2.0
# MixTAO-7Bx2-MoE-Instruct-v4.0
MixTAO-7Bx2-MoE-Instruct-v4.0 is a Mixture of Experts (MoE) testing merge model.
## 💻 Usage

### 1. text-generation-webui

Model Tab
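Besides text-generation-webui, the model can be loaded directly with the Hugging Face `transformers` library. The sketch below is a minimal, hedged example: the repo id comes from this card, the BF16 dtype matches the tensor type listed below, but the chat-template call assumes the tokenizer ships one (common for mixtral-style instruct models, not confirmed here).

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "mixtao/MixTAO-7Bx2-MoE-Instruct-v4.0"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the ~13B-param BF16 model and generate a reply to one user turn."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # tensor type listed on this card
        device_map="auto",           # spread layers across available devices
    )
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

# Example (downloads the full BF16 weights on first run):
# print(generate("Explain Mixture of Experts in one sentence."))
```

Note that loading requires enough memory for roughly 26 GB of BF16 weights; quantized loading (e.g. `load_in_4bit` via bitsandbytes) is an option on smaller GPUs.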
Downloads last month: 75

Safetensors · Model size: 13B params · Tensor type: BF16
Inference Providers (Text Generation): this model is not deployed by any Inference Provider.