Hugging Face
Long-Short-Term-Midgets/dpo-adapter-v1
Tags: TensorBoard, Safetensors
dpo-adapter-v1 / training_args.bin (branch: main)
Commit History
dpo upload (commit bcb3d2e, verified) by Dapinsky, committed on May 27, 2024
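The file listed above, `training_args.bin`, is the serialized training configuration that `transformers.Trainer` writes alongside checkpoints via `torch.save`. A minimal sketch for inspecting it locally is below; the file path is illustrative, and the `weights_only=False` argument is an assumption needed on newer PyTorch versions (2.6+) because the file contains an arbitrary pickled object rather than plain tensors:

```python
import torch

def load_training_args(path: str):
    """Load a Hugging Face training_args.bin file saved with torch.save.

    On torch >= 2.6 the default weights_only=True refuses arbitrary
    pickled objects, so we pass weights_only=False explicitly; older
    torch versions without that keyword fall back to a plain load.
    """
    try:
        return torch.load(path, weights_only=False)
    except TypeError:
        # Older torch: torch.load has no weights_only parameter.
        return torch.load(path)
```

After downloading the file from the repo, something like `args = load_training_args("training_args.bin")` recovers the `TrainingArguments` object, whose fields (e.g. `args.learning_rate`) record the hyperparameters used for the DPO run.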