Cherran/medical_gemma_DPO_EP2
Tags: PEFT · Safetensors · arxiv:1910.09700
medical_gemma_DPO_EP2 · 691 MB · 1 contributor · History: 2 commits
Latest commit: a4a1d6e (verified) — "Upload folder using huggingface_hub" by Cherran, 11 months ago
Files (all from commit "Upload folder using huggingface_hub", 11 months ago; "xet" marks files stored via Xet-backed storage):

.gitattributes             1.57 kB
README.md                  5.1 kB
adapter_config.json        801 Bytes
adapter_model.safetensors  432 MB     xet
optimizer.pt               220 MB     xet
rng_state.pth              14.3 kB    xet
scheduler.pt               1.06 kB    xet
special_tokens_map.json    636 Bytes
tokenizer.json             34.4 MB    xet
tokenizer.model            4.24 MB    xet
tokenizer_config.json      47 kB
training_args.bin          6.2 kB     xet
trainer_state.json         1.25 kB
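The presence of adapter_config.json and adapter_model.safetensors (with no full model weights) indicates this repo holds a PEFT adapter rather than a standalone model. As a sketch only: a typical adapter_config.json for a LoRA-style adapter uses standard PEFT field names like the ones below. The field names are real PEFT configuration keys, but every value here is an illustrative assumption, not the actual contents of this repo's 801-byte file.

```json
{
  "peft_type": "LORA",
  "task_type": "CAUSAL_LM",
  "base_model_name_or_path": "google/gemma-2b",
  "r": 16,
  "lora_alpha": 32,
  "lora_dropout": 0.05,
  "target_modules": ["q_proj", "k_proj", "v_proj", "o_proj"],
  "bias": "none",
  "inference_mode": true
}
```

The base_model_name_or_path field is what `peft.PeftModel.from_pretrained` reads to know which base checkpoint the adapter weights attach to; the actual base model for this repo is not stated on the page.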