---
library_name: transformers
pipeline_tag: text-generation
tags:
- merlina
- grimoire
- text-generation
- sft
datasets:
- hemlang/hemlock-codex-SFT
base_model:
- hemlang/Hemlock2-Coder-7B
---

![image/png](https://huggingface.co/datasets/nbeerbower/cover-images/resolve/main/hemlock_kawaii.png)

# Hemlock-Codex-7B

## Training Configuration

| Parameter | Value |
|-----------|-------|
| Training Mode | SFT |
| Base Model | `hemlang/Hemlock2-Coder-7B` |
| Learning Rate | 0.0001 |
| Epochs | 3 |
| Batch Size | 2 |
| Gradient Accumulation | 16 |
| Effective Batch Size | 32 |
| Max Sequence Length | 8192 |
| Optimizer | paged_adamw_8bit |
| LR Scheduler | cosine |
| Warmup Ratio | 0.05 |
| Weight Decay | 0.01 |
| Max Grad Norm | 0.25 |
| Seed | 42 |
| LoRA Rank (r) | 128 |
| LoRA Alpha | 128 |
| LoRA Dropout | 0.05 |
| Target Modules | k_proj, o_proj, q_proj, v_proj, down_proj, gate_proj, up_proj |
| Quantization | 4-bit (NF4) |
| GPU | NVIDIA RTX A6000 |

---

![Trained with Merlina](https://raw.githubusercontent.com/Schneewolf-Labs/Merlina/refs/heads/main/frontend/madewithmerlina_smol.png)

[Merlina on GitHub](https://github.com/Schneewolf-Labs/Merlina)
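
The parameters in the table map onto a QLoRA-style SFT setup. Below is a minimal configuration sketch using the standard Hugging Face `transformers` and `peft` APIs; the actual training script is not published here, and the `bfloat16` compute dtype is an assumption (the table does not specify one):

```python
import torch
from transformers import BitsAndBytesConfig, TrainingArguments
from peft import LoraConfig

# 4-bit NF4 quantization, per the "Quantization" row.
# Compute dtype is an assumption, not stated in the table.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# LoRA hyperparameters and target modules, per the table.
lora_config = LoraConfig(
    r=128,
    lora_alpha=128,
    lora_dropout=0.05,
    target_modules=[
        "k_proj", "o_proj", "q_proj", "v_proj",
        "down_proj", "gate_proj", "up_proj",
    ],
    task_type="CAUSAL_LM",
)

# Optimizer and schedule settings, per the table.
# Effective batch size = 2 (per device) x 16 (accumulation) = 32.
training_args = TrainingArguments(
    learning_rate=1e-4,
    num_train_epochs=3,
    per_device_train_batch_size=2,
    gradient_accumulation_steps=16,
    optim="paged_adamw_8bit",
    lr_scheduler_type="cosine",
    warmup_ratio=0.05,
    weight_decay=0.01,
    max_grad_norm=0.25,
    seed=42,
)
```

These objects would then be passed to a trainer (e.g. TRL's `SFTTrainer`) along with the base model `hemlang/Hemlock2-Coder-7B` and the `hemlang/hemlock-codex-SFT` dataset.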