---
base_model: sentence-transformers/all-MiniLM-L6-v2
library_name: peft
license: mit
tags:
- lora
- peft
- finance
- financial
- economics
- domain-adaptation
- sentence-embeddings
language:
- en
---
# Finance LoRA Adapter for DomainEmbedder-v2.6
Domain-specific LoRA adapter for finance/economics text embeddings.
## Model Details
| Property | Value |
|----------|-------|
| **Base Model** | sentence-transformers/all-MiniLM-L6-v2 |
| **Parent System** | DomainEmbedder-v2.6 |
| **Domain** | Finance / Economics |
| **LoRA Rank** | 16 |
| **LoRA Alpha** | 32 |
| **Target Modules** | query, value |
| **Trainable Params** | 147,456 (0.645%) |
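The trainable-parameter count follows directly from the configuration above. all-MiniLM-L6-v2 has 6 transformer layers with hidden size 384 (properties of the base model, not stated in this card), and rank-16 LoRA adds a pair of low-rank matrices to each targeted projection. A quick sanity check:

```python
# Sanity-check the trainable-parameter count for rank-16 LoRA on the
# query and value projections of all-MiniLM-L6-v2.
hidden_size = 384   # all-MiniLM-L6-v2 hidden dimension
num_layers = 6      # transformer layers in the base encoder
rank = 16           # LoRA rank from the table above

# Each adapted module gets an A matrix (rank x in_features) and a
# B matrix (out_features x rank); both projections are 384 x 384.
params_per_module = rank * hidden_size + hidden_size * rank  # 12,288

# Two target modules (query, value) in each of the 6 layers.
trainable = params_per_module * num_layers * 2
print(trainable)  # 147456, matching the table
```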
## Training Data
Trained on 40,000 finance text pairs from:
- Finance Alpaca
- FinGPT-FiQA
- Financial QA
## Training Configuration
| Parameter | Value |
|-----------|-------|
| Epochs | 3 |
| Batch Size | 32 |
| Learning Rate | 2e-4 |
| Loss | Contrastive (InfoNCE) |
| Best Val Loss | 0.0033 |
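The exact pairing scheme and temperature used in training are not stated in this card; a minimal sketch of an in-batch InfoNCE objective, assuming cosine similarity and a temperature of 0.05, might look like:

```python
import numpy as np

def info_nce(query_emb, pos_emb, temperature=0.05):
    """In-batch InfoNCE: each query's positive is the same-index
    passage; all other passages in the batch act as negatives."""
    # L2-normalize so dot products are cosine similarities.
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    p = pos_emb / np.linalg.norm(pos_emb, axis=1, keepdims=True)
    logits = q @ p.T / temperature               # (B, B) similarity matrix
    # Cross-entropy with the diagonal as the target class.
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

When query and positive embeddings are perfectly aligned the loss approaches zero; mismatched pairs drive it up, which is what pushes the adapter toward domain-discriminative embeddings.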
## Performance
The finance adapter achieved the highest accuracy of any domain adapter in the DomainEmbedder system:
- Training accuracy: 78.0%
- Improvement over the base encoder: +78.0%
## Usage
This adapter is part of the DomainEmbedder-v2.6 system. It is selected automatically by the RL policy when financial content is detected.
```python
from peft import PeftModel
from transformers import AutoModel

# Load the base encoder
base_encoder = AutoModel.from_pretrained('sentence-transformers/all-MiniLM-L6-v2')

# Apply the finance LoRA adapter (replace the path with your local
# checkpoint directory or Hub repository ID)
finance_model = PeftModel.from_pretrained(base_encoder, 'path/to/finance_lora')
```
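The model above returns token-level hidden states; sentence-transformers models derive a sentence embedding by mean-pooling those states over the attention mask. A framework-agnostic sketch of that pooling step (NumPy here for clarity; in practice you would apply the same operations to the torch tensors the model returns):

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average token embeddings, ignoring padding positions.

    token_embeddings: (batch, seq_len, dim) last hidden states
    attention_mask:   (batch, seq_len), 1 for real tokens, 0 for padding
    """
    mask = attention_mask[..., None].astype(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)  # avoid divide-by-zero
    return summed / counts
```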
## Author
**Zain Asad**
## License
MIT License
## Framework Versions
- PEFT 0.18.1
- Transformers 4.x
- PyTorch 2.x