How to use OpenBabylon/MamayLM-ORPO-align-lora64 with Transformers:
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("OpenBabylon/MamayLM-ORPO-align-lora64", dtype="auto")
How can I fix this?
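One possible direction: the "lora64" suffix in the repository name suggests this repo may hold a LoRA adapter rather than full model weights, in which case AutoModel.from_pretrained on the repo alone would fail and the adapter would need to be loaded through the peft library. Below is a minimal sketch of that approach; the use of AutoPeftModelForCausalLM and loading the tokenizer from the adapter repo are assumptions, since I have not checked the repository contents.

# Assumption: the repo contains a LoRA adapter, so load it via peft,
# which resolves the base model referenced in adapter_config.json.
from transformers import AutoTokenizer
from peft import AutoPeftModelForCausalLM

adapter_id = "OpenBabylon/MamayLM-ORPO-align-lora64"

# AutoPeftModelForCausalLM downloads the base model named in the adapter
# config and attaches the LoRA weights on top of it.
model = AutoPeftModelForCausalLM.from_pretrained(adapter_id, torch_dtype="auto")

# Loading the tokenizer from the adapter repo is an assumption; it may
# need to come from the base model instead.
tokenizer = AutoTokenizer.from_pretrained(adapter_id)

If the repository turns out to contain full model weights instead, swapping AutoModel for AutoModelForCausalLM from transformers would be the usual way to get a model with a text-generation head.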