---
base_model: meta-llama/Llama-2-7b-hf
tags:
  - LoRA
  - DPO
license: apache-2.0
---

# Submission dpo_test_demo

A LoRA adapter for `meta-llama/Llama-2-7b-hf`, trained with DPO (Direct Preference Optimization).

- SHA256: `7bd243dda3f16d8f64ff353e80937e1bcb4a73d8ca75d223a22da6aeeb0283de`
- Timestamp: `2025-08-08T13:14:27.858532`