How to use OpenBabylon/MamayLM-ORPO-align-lora256 with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("OpenBabylon/MamayLM-ORPO-align-lora256", dtype="auto")
```
How do I fix this?
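For reference, a minimal loading-and-generation sketch, not the repo's documented usage. It assumes the repo hosts a full causal-LM checkpoint (so `AutoModelForCausalLM` applies instead of the bare `AutoModel`); the prompt text and generation settings are only illustrative.

```python
# Sketch, assuming the repo contains a full causal-LM checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OpenBabylon/MamayLM-ORPO-align-lora256"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # newer Transformers releases also accept dtype="auto"
    device_map="auto",    # requires the accelerate package
)

# Illustrative prompt only.
prompt = "Привіт! Розкажи коротко про себе."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the repo instead contains only a LoRA adapter (the `lora256` suffix in the name suggests this is possible, but it is an assumption), the adapter can be loaded on top of its base model with PEFT:

```python
# Assumption: the repo is a PEFT/LoRA adapter whose config points at its base model.
from peft import AutoPeftModelForCausalLM

model = AutoPeftModelForCausalLM.from_pretrained(
    "OpenBabylon/MamayLM-ORPO-align-lora256",
    torch_dtype="auto",
)
```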