Load the model directly with the Auto classes; `GLMRobertaLargeForMabel` is a custom architecture shipped in the repository's remote code, so `trust_remote_code=True` is required and the class is resolved through `AutoModel` rather than imported from `transformers`:

```python
# Load model directly (custom architecture, resolved via the repo's remote code)
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("zd21/mabel-glm-roberta-large", trust_remote_code=True)
model = AutoModel.from_pretrained("zd21/mabel-glm-roberta-large", trust_remote_code=True)
```
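Once the model is loaded, a common way to turn its token-level hidden states into sentence embeddings is attention-mask-aware mean pooling. The sketch below uses dummy tensors in place of a real forward pass, since the model is gated; whether MABEL's remote code exposes its own pooler is not stated here, so the `mean_pool` helper is an illustrative assumption, not the repository's API.

```python
import torch

def mean_pool(last_hidden_state: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # Zero out padding positions, then average over the real tokens only.
    mask = attention_mask.unsqueeze(-1).to(last_hidden_state.dtype)  # (batch, seq, 1)
    summed = (last_hidden_state * mask).sum(dim=1)                   # (batch, hidden)
    counts = mask.sum(dim=1).clamp(min=1e-9)                         # (batch, 1)
    return summed / counts

# Dummy tensors standing in for `model(**inputs).last_hidden_state` and
# the tokenizer's attention mask (hidden size 1024 for a roberta-large backbone).
hidden = torch.randn(2, 8, 1024)                 # (batch, seq_len, hidden_size)
mask = torch.tensor([[1] * 8, [1] * 5 + [0] * 3])
emb = mean_pool(hidden, mask)
print(emb.shape)  # torch.Size([2, 1024])
```

With the real model, `hidden` would come from `model(**tokenizer(sentences, padding=True, return_tensors="pt"))`, and the resulting embeddings can be compared with cosine similarity.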
This is a gated model: before downloading, log in with a Hugging Face token that has gated-access permission, e.g. via `hf auth login`.