---
base_model: sentence-transformers/all-MiniLM-L6-v2
library_name: peft
license: mit
tags:
- lora
- peft
- finance
- financial
- economics
- domain-adaptation
- sentence-embeddings
language:
- en
---

# Finance LoRA Adapter for DomainEmbedder-v2.6

Domain-specific LoRA adapter for finance/economics text embeddings.

## Model Details

| Property | Value |
|----------|-------|
| **Base Model** | sentence-transformers/all-MiniLM-L6-v2 |
| **Parent System** | DomainEmbedder-v2.6 |
| **Domain** | Finance / Economics |
| **LoRA Rank** | 16 |
| **LoRA Alpha** | 32 |
| **Target Modules** | query, value |
| **Trainable Params** | 147,456 (0.645%) |
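
The trainable-parameter figure can be sanity-checked from the rank and target modules alone. A back-of-the-envelope calculation, assuming the standard all-MiniLM-L6-v2 shape (6 transformer layers, hidden size 384) and one LoRA pair per adapted projection:

```python
# Rough check of the trainable-parameter count in the table above.
# Assumptions: 6 layers, hidden size 384 (the usual all-MiniLM-L6-v2 shape),
# with LoRA applied to the query and value projections in each layer.
hidden = 384
layers = 6
rank = 16
modules_per_layer = 2  # query, value

# Each LoRA pair adds A (rank x hidden) plus B (hidden x rank) parameters.
per_module = rank * hidden + hidden * rank
trainable = per_module * modules_per_layer * layers
print(trainable)  # prints 147456
```

This reproduces the 147,456 figure exactly, consistent with rank-16 adapters on two projections per layer.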

## Training Data

Trained on 40,000 finance text pairs from:

- Finance Alpaca
- FinGPT-FiQA
- Financial QA

## Training Configuration

| Parameter | Value |
|-----------|-------|
| Epochs | 3 |
| Batch Size | 32 |
| Learning Rate | 2e-4 |
| Loss | Contrastive (InfoNCE) |
| Best Val Loss | 0.0033 |
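
For reference, in-batch InfoNCE treats each query's same-index passage as the positive and every other passage in the batch as a negative. A minimal sketch of this loss in PyTorch; the temperature value is an assumption, as the one used in training is not stated in this card:

```python
import torch
import torch.nn.functional as F

def info_nce_loss(query_emb, pos_emb, temperature=0.05):
    """In-batch InfoNCE: the i-th passage is the positive for the i-th query;
    all other passages in the batch serve as negatives."""
    q = F.normalize(query_emb, dim=-1)
    p = F.normalize(pos_emb, dim=-1)
    logits = q @ p.T / temperature     # (batch, batch) cosine-similarity matrix
    labels = torch.arange(q.size(0))   # diagonal entries are the positives
    return F.cross_entropy(logits, labels)

# Smoke test with random embeddings at the base model's hidden size
loss = info_nce_loss(torch.randn(4, 384), torch.randn(4, 384))
```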

## Performance

The finance domain achieved the highest accuracy of any adapter in the DomainEmbedder system:

- Training Accuracy: 78.0%
- Improvement over base model: +78.0%

## Usage

This adapter is part of the DomainEmbedder-v2.6 system. It is selected automatically by the RL policy when financial content is detected.

```python
from peft import PeftModel
from transformers import AutoModel

# Load the base encoder
base_encoder = AutoModel.from_pretrained('sentence-transformers/all-MiniLM-L6-v2')

# Apply the finance LoRA adapter on top of it
finance_model = PeftModel.from_pretrained(base_encoder, 'path/to/finance_lora')
```
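
The loaded model returns token-level hidden states; sentence embeddings for the base model are conventionally obtained by masked mean pooling followed by L2 normalization. A sketch of the pooling step, using a dummy tensor in place of real model output so it runs standalone:

```python
import torch
import torch.nn.functional as F

def mean_pool(last_hidden_state, attention_mask):
    """Average token embeddings across the sequence, ignoring padding positions."""
    mask = attention_mask.unsqueeze(-1).float()
    summed = (last_hidden_state * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)  # guard against division by zero
    return summed / counts

# Dummy stand-in for finance_model(**inputs).last_hidden_state:
# batch of 2, 5 tokens, hidden size 384; second sequence has 2 padding tokens.
hidden_states = torch.randn(2, 5, 384)
attention_mask = torch.tensor([[1, 1, 1, 1, 1], [1, 1, 1, 0, 0]])

emb = F.normalize(mean_pool(hidden_states, attention_mask), dim=-1)
print(emb.shape)  # prints torch.Size([2, 384])
```

With real inputs, the `attention_mask` comes from the tokenizer and `hidden_states` from the model's `last_hidden_state`.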

## Author

**Zain Asad**

## License

MIT License

## Framework Versions

- PEFT 0.18.1
- Transformers 4.x
- PyTorch 2.x