How to use suayptalha/MoE-Router-v2 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="suayptalha/MoE-Router-v2")
```
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("suayptalha/MoE-Router-v2")
model = AutoModelForSequenceClassification.from_pretrained("suayptalha/MoE-Router-v2")
```
The classifier routes inputs to one of four labels:

```python
labels = ['code', 'if', 'math', 'medical']
```
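The sequence-classification head outputs one logit per label; routing picks the highest-scoring one. A minimal sketch of that argmax step in plain Python (no model download), assuming the label order above matches the model's `id2label` mapping and using made-up logit values:

```python
labels = ['code', 'if', 'math', 'medical']

def route(logits):
    """Return the label with the highest logit (argmax routing)."""
    best = max(range(len(logits)), key=lambda i: logits[i])
    return labels[best]

# Hypothetical logits favoring the 'math' label
print(route([-1.2, 0.3, 4.1, -0.5]))  # -> math
```

With the real model, the same step is `model(**tokenizer(text, return_tensors="pt")).logits.argmax(-1)`, then a lookup into `model.config.id2label`.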