---
language:
- en
base_model:
- microsoft/Phi-4-mini-instruct
pipeline_tag: text-generation
tags:
- professional
- linkedin
---
* Parameter-efficient, instruction-fine-tuned Phi-4-mini model.
* Fine-tuned with LoRA (Low-Rank Adaptation).
* Trained on LinkedIn posts covering a variety of themes.
* Training details (a configuration sketch follows the lists below):
  - Training set size: 2643
  - Quantization: 8-bit
  - Optimizer: AdamW
  - Learning rate: 1e-4
  - Epochs: 1
  - Train batch size: 1
  - Eval batch size: 4
  - Gradient accumulation steps: 8
  - Sequence length: 412
* LoRA configs:
  - Rank: 16
  - Alpha: 16
  - Dropout: 0.05