fix: Replace incorrect 1.2B config with correct 3B ORPO config (hidden_size=3072, 28 layers, vocab=64256)
e5b6ed2 · verified · pathcosmos committed 14 days ago
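The three corrected values in the commit can be sketched as a Hugging Face-style `config.json` fragment; the field names follow the usual transformers convention, and any other settings of the actual 3B ORPO config are not taken from the commit:

```json
{
  "hidden_size": 3072,
  "num_hidden_layers": 28,
  "vocab_size": 64256
}
```

The previous (incorrect) 1.2B config would have carried smaller values for these fields; the commit replaces them wholesale rather than patching individual keys.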