tensorblock/DPO_model-GGUF
Tags: Transformers, GGUF, TensorBlock, conversational
File: DPO_model-GGUF/DPO_model-Q2_K.gguf (branch: main)

Commit History
Upload folder using huggingface_hub
d5598d7 (verified) — morriszms committed on Dec 3, 2024