Tags: Any-to-Any, Transformers, Safetensors, English, xoron, multimodal, Mixture of Experts, text-to-image, image editing, image to video, text-to-video, video editing, text-to-speech, speech-to-text, speech-to-speech, image-to-text, video-to-text, agentic, tool-use, flow-matching, 3d-rope, titok, vidtok, dual-stream-attention, zero-shot-voice-cloning, bigvgan, snake-activation, multi-receptive-field-fusion, custom_code
Instructions to use Backup-bdg/Xoron-Dev-MultiMoe with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers
How to use Backup-bdg/Xoron-Dev-MultiMoe with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "Backup-bdg/Xoron-Dev-MultiMoe",
    trust_remote_code=True,
    dtype="auto",
)
```

- Notebooks
  - Google Colab
  - Kaggle
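
Once the weights are loaded, a minimal inference sketch is shown below. Because the repository ships custom modeling code (the custom_code tag) and its exact generation interface is not documented here, the bundled tokenizer, the `generate()` call, and the example prompt are all assumptions and may need adjusting to the model's actual API.

```python
# Hedged sketch: assumes the repo bundles a tokenizer and that the
# custom model class exposes a standard text-generation interface.
from transformers import AutoModel, AutoTokenizer

repo = "Backup-bdg/Xoron-Dev-MultiMoe"

# Assumption: a tokenizer is shipped alongside the custom model code.
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModel.from_pretrained(repo, trust_remote_code=True, dtype="auto")

# Illustrative prompt; the model is any-to-any, so text-in/text-out is
# only one of the supported modalities.
inputs = tokenizer("Describe this model in one sentence.", return_tensors="pt")

# Assumption: the custom model implements generate(); adjust if the
# repository documents a different entry point.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```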