jmyang's Collections

Meta APO
updated 1 day ago
Models of MetaAPO (https://arxiv.org/abs/2509.23371)
- jmyang/llama3.1-8b-rm-ultrafeedback (8B) • Updated Nov 15, 2025
- jmyang/llama3.1-8b-dpo-ultrafeedback (8B) • Updated Nov 15, 2025
- jmyang/MetaAPO-Llama3.1-8B (0.5B) • Updated Jan 2
- jmyang/Qwen2.5-7B-DPO (8B) • Updated Jan 6
- jmyang/Qwen2.5-7B-rm (1B) • Updated 1 day ago
- jmyang/MetaAPO-Qwen2.5-7B (0.5B) • Updated 1 day ago