Instructions to use BAAI/Emu3.5-Image with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use BAAI/Emu3.5-Image with Transformers:

```python
# Load model directly
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("BAAI/Emu3.5-Image", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
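The loading one-liner above can be expanded into a minimal end-to-end sketch. This is a hedged example, not an official recipe: the prompt text is made up, and whether Emu3.5-Image requires `trust_remote_code=True`, a dedicated processor, or a custom decoding path (see the community questions below) should be verified against the model card before relying on it.

```python
# Hedged sketch: standard causal-LM text generation with Transformers.
# Assumptions (verify against the BAAI/Emu3.5-Image model card):
#   - the checkpoint works with AutoTokenizer / AutoModelForCausalLM
#   - trust_remote_code=True is needed for any custom model code
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BAAI/Emu3.5-Image"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    dtype="auto",       # pick the checkpoint's native precision
    device_map="auto",  # spread layers across available GPUs/CPU
    trust_remote_code=True,
)

# Example prompt (placeholder text, not from the model card)
inputs = tokenizer("A photo of a red bicycle", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that this checkpoint is large; `device_map="auto"` (from the `accelerate` package) helps fit it across available devices, but downloading and running it still requires substantial disk space and GPU memory.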
Community discussions:
- #5 "Newbie needs help with installation errors" — opened 6 months ago by lhxmailbox
- #4 "Where can I find the GenerationMixin, generation config, and inheritance files? When will the advanced decoder be available?" — opened 6 months ago by EricRollei
- #1 "Quantized model" — opened 7 months ago by parthwagh