How to use ZySec-AI/Mamba-2.8B-CyberSec with Transformers:
```python
# Load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("ZySec-AI/Mamba-2.8B-CyberSec", dtype="auto")
```
Hi, just out of curiosity: what hardware is needed to fine-tune Mamba 2.8B?
For LoRA, not much: around 8 GB of VRAM is enough. For full fine-tuning, you need about 24 GB.
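As a rough sanity check on those numbers, here is a back-of-envelope VRAM estimate for a 2.8B-parameter model. The byte counts per parameter are assumptions about a typical mixed-precision setup, not measurements of this specific model:

```python
# Back-of-envelope VRAM estimate for a 2.8B-parameter model.
# Byte counts are assumptions about a typical training setup.
params = 2.8e9
fp16_bytes = 2  # bytes per parameter for fp16/bf16 weights

weights_gb = params * fp16_bytes / 1e9
print(f"fp16 weights alone: ~{weights_gb:.1f} GB")

# LoRA: the base weights stay frozen (no gradients or optimizer states for
# them), and the adapter is typically well under 1% of the base parameter
# count, so the ~5.6 GB of frozen weights dominates and ~8 GB can suffice.

# Full fine-tuning with mixed-precision Adam typically stores fp16 weights,
# fp16 gradients, and fp32 optimizer moments (often plus fp32 master
# weights), on the order of 12-16 bytes per parameter before activations:
full_gb = params * 16 / 1e9
print(f"full fine-tune (naive mixed-precision Adam): ~{full_gb:.1f} GB")
```

Note the naive full-fine-tuning estimate comes out well above 24 GB, so fitting it on a single 24 GB card implies memory-saving techniques such as an 8-bit optimizer, gradient checkpointing, or CPU offloading.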