How to use mjerome89/moebertyc4stacked with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="mjerome89/moebertyc4stacked", trust_remote_code=True)
```
```python
# Load model directly (the tokenizer is needed alongside the model for inference)
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("mjerome89/moebertyc4stacked", trust_remote_code=True)
model = AutoModelForMaskedLM.from_pretrained("mjerome89/moebertyc4stacked", trust_remote_code=True, dtype="auto")
```
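Under the hood, a fill-mask pipeline runs the tokenized text through the model, takes the logits at the masked position, and picks the highest-probability token. A minimal sketch of that decoding step, using a dummy vocabulary and hand-written logits in place of a real forward pass (nothing here comes from mjerome89/moebertyc4stacked itself):

```python
import torch

# Hypothetical stand-ins for a tokenizer's vocabulary and a model's output logits.
vocab = ["paris", "capital", "city", "country"]
mask_index = 2  # position of the [MASK] token in the tokenized input
logits = torch.tensor([
    [0.1, 0.2, 0.3, 0.4],
    [0.5, 0.1, 0.2, 0.2],
    [0.1, 3.0, 0.4, 0.2],  # logits at the masked position
    [0.2, 0.2, 0.2, 0.4],
])

# Softmax over the vocabulary at the masked position, then argmax.
probs = logits[mask_index].softmax(dim=-1)
top_id = int(probs.argmax())
print(vocab[top_id])  # the highest-probability fill for the mask
```

With a real checkpoint, `tokenizer` and `model` replace the dummy values, and the pipeline additionally maps the predicted token id back to a string for you.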