# mistral-7b-openhermes-2.5-sft - bnb 4bits

Quantization made by Richard Erkhov.

- Model creator: https://huggingface.co/CorticalStack/
- Original model: https://huggingface.co/CorticalStack/mistral-7b-openhermes-2.5-sft/
Original model description:

License: apache-2.0

## mistral-7b-openhermes-2.5-sft
mistral-7b-openhermes-2.5-sft is a supervised fine-tuned (SFT) version of unsloth/mistral-7b-bnb-4bit, trained on the teknium/OpenHermes-2.5 dataset.
## Fine-tuning configuration

### LoRA
- r: 256
- LoRA alpha: 128
- LoRA dropout: 0.0
### Training arguments
- Epochs: 1
- Batch size: 4
- Gradient accumulation steps: 6
- Optimizer: adamw_torch_fused
- Max steps: 100
- Learning rate: 0.0002
- Weight decay: 0.1
- Learning rate scheduler type: linear
- Max seq length: 2048
- 4-bit bnb: True
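The settings above can be sketched as plain configuration dicts whose key names mirror the usual `peft.LoraConfig` and `transformers.TrainingArguments` fields; this is only an illustration, since the author's actual training script is not published:

```python
# Hedged sketch of the fine-tuning configuration listed above.
# Key names follow common peft / transformers conventions, but the
# original training script is not published, so treat them as illustrative.
lora_config = {
    "r": 256,            # LoRA rank
    "lora_alpha": 128,   # scaling factor (alpha / r = 0.5)
    "lora_dropout": 0.0,
}

training_args = {
    "num_train_epochs": 1,
    "per_device_train_batch_size": 4,
    "gradient_accumulation_steps": 6,
    "optim": "adamw_torch_fused",
    "max_steps": 100,
    "learning_rate": 2e-4,
    "weight_decay": 0.1,
    "lr_scheduler_type": "linear",
    "max_seq_length": 2048,
    "load_in_4bit": True,
}

# With gradient accumulation, the effective batch size per optimizer
# step is the per-device batch times the accumulation steps:
effective_batch_size = (
    training_args["per_device_train_batch_size"]
    * training_args["gradient_accumulation_steps"]
)  # 4 * 6 = 24
```

Note that with `max_steps: 100` and an effective batch size of 24, the run covers only a small slice of OpenHermes-2.5 rather than a full epoch.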
Trained with Unsloth and Hugging Face's TRL library.