cloudyu/Mixtral_7Bx5_MoE_30B
Text Generation · Transformers · Safetensors · mixtral · text-generation-inference · License: cc-by-nc-4.0
Add MOE (mixture of experts) tag
#3 · by davanstrien (HF Staff) · opened Jan 13, 2024
base: refs/heads/main ← from: refs/pr/3
Files changed: +3 -1
davanstrien · Jan 13, 2024
No description provided.
Commit: Add MOE (mixture of experts tag) (e8487af6)
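The diff itself is not shown on this page, but Hugging Face model cards keep their metadata in a YAML front-matter block at the top of README.md, and a tag-addition PR like this one typically inserts a line into that block's `tags:` list. A minimal sketch of that pattern, using a hypothetical `add_tag` helper and an illustrative card (not the actual diff from this PR):

```python
def add_tag(readme: str, tag: str) -> str:
    """Insert `- <tag>` under the `tags:` key of a model card's front matter.

    Minimal sketch: assumes a well-formed YAML front-matter block with an
    existing `tags:` list; real tooling would parse the YAML properly.
    """
    out = []
    for line in readme.splitlines():
        out.append(line)
        if line.strip() == "tags:":
            # New tag goes directly under the `tags:` key.
            out.append(f"- {tag}")
    return "\n".join(out)


# Illustrative model card, not the real README of this repository.
card = """---
license: cc-by-nc-4.0
tags:
- mixtral
---
# Mixtral_7Bx5_MoE_30B
"""

print(add_tag(card, "moe"))
```

Adding the tag only touches the front matter; the Hub then surfaces the model under the corresponding filter, which is why such PRs have very small diffs.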
Ready to merge
This branch is ready to be merged automatically.