Hugging Face
anthonym21/Eve-2-MoE-IT-272M
Tags: Text Generation · Transformers · Safetensors · PyTorch · mlabonne/open-perfectblend · English · eve_moe · Mixture of Experts · deepseek · instruction-tuned · nvidia-h200 · nano-lm · edge-ai · swarm-foundation · custom_code
License: MIT
Branch: main · 1.19 GB · 1 contributor · History: 17 commits
Latest commit: anthonym21 — Update README.md (7ca18e0, verified, 1 day ago)
File                    Size        Last commit message                                                         Updated
.gitattributes          1.57 kB     Add Eve-2 swarm logo                                                        1 day ago
README.md               5.79 kB     Update README.md                                                            1 day ago
config.json             650 Bytes   Upload folder using huggingface_hub                                         2 days ago
configuration_eve.py    1.56 kB     Upload folder using huggingface_hub                                         3 days ago
eve-2-swarm.jpg         160 kB      Add Eve-2 swarm logo                                                        1 day ago
generation_config.json  152 Bytes   Eve-2-MoE-IT-272M: heavy IT patch (open-perfectblend, LoRA r=128, merged)   3 days ago
model.safetensors       1.19 GB     Eve-2-MoE-IT-272M: heavy IT patch (open-perfectblend, LoRA r=128, merged)   3 days ago
modeling_eve.py         11.3 kB     Upload folder using huggingface_hub                                         2 days ago
push_to_hub.py          435 Bytes   Upload folder using huggingface_hub                                         2 days ago
tokenizer.json          3.56 MB     Eve-2-MoE-IT-272M: heavy IT patch (open-perfectblend, LoRA r=128, merged)   3 days ago
tokenizer_config.json   297 Bytes   Upload folder using huggingface_hub                                         3 days ago