---
base_model: sentence-transformers/all-MiniLM-L6-v2
library_name: peft
license: mit
tags:
- lora
- peft
- legal
- law
- domain-adaptation
- sentence-embeddings
language:
- en
---
# Legal LoRA Adapter for DomainEmbedder-v2.6
A domain-specific LoRA adapter that specializes the base encoder for legal/law text embeddings.
## Model Details
| Property | Value |
|---|---|
| Base Model | sentence-transformers/all-MiniLM-L6-v2 |
| Parent System | DomainEmbedder-v2.6 |
| Domain | Legal / Law |
| LoRA Rank | 16 |
| LoRA Alpha | 32 |
| Target Modules | query, value |
| Trainable Params | 147,456 (0.645%) |
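As a sanity check, the trainable-parameter count follows directly from the LoRA shapes: two rank-16 matrices per adapted projection, hidden size 384, six transformer layers, and two target modules (query, value) per layer:

```python
# LoRA adds two low-rank matrices per adapted projection:
# A (d x r) and B (r x d), so 2 * r * d parameters each.
hidden_size = 384       # all-MiniLM-L6-v2 hidden dimension
num_layers = 6          # transformer layers in the base encoder
rank = 16               # LoRA rank
modules_per_layer = 2   # query and value projections

params_per_module = 2 * rank * hidden_size
trainable = params_per_module * modules_per_layer * num_layers
print(trainable)  # 147456, matching the table above
```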
## Training Data
Trained on 40,000 legal text pairs from:
- EUR-LEX (European legislation)
- CaseHold (US case law)
- ECTHR-A (European Court of Human Rights)
- ECTHR-B (European Court of Human Rights)
## Training Configuration
| Parameter | Value |
|---|---|
| Epochs | 3 |
| Batch Size | 32 |
| Learning Rate | 2e-4 |
| Loss | Contrastive (InfoNCE) |
| Best Val Loss | 0.0001 |
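The contrastive objective is InfoNCE over in-batch negatives. A minimal PyTorch sketch of that loss (the temperature value here is an illustrative assumption, not taken from the training run):

```python
import torch
import torch.nn.functional as F

def info_nce_loss(anchors, positives, temperature=0.05):
    """In-batch InfoNCE: each anchor's positive is the matching row;
    every other row in the batch serves as a negative."""
    anchors = F.normalize(anchors, dim=-1)
    positives = F.normalize(positives, dim=-1)
    logits = anchors @ positives.T / temperature   # (B, B) similarity matrix
    labels = torch.arange(anchors.size(0))         # diagonal entries are positives
    return F.cross_entropy(logits, labels)

# Example with random 384-dim embeddings and the batch size used in training
a, p = torch.randn(32, 384), torch.randn(32, 384)
loss = info_nce_loss(a, p)
```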
## Usage
This adapter is part of the DomainEmbedder-v2.6 system. It is selected automatically by the RL policy when legal content is detected.
```python
from peft import PeftModel
from transformers import AutoModel

# Load base encoder
base_encoder = AutoModel.from_pretrained('sentence-transformers/all-MiniLM-L6-v2')

# Apply legal LoRA
legal_model = PeftModel.from_pretrained(base_encoder, 'path/to/legal_lora')
```
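Note that `AutoModel` returns per-token hidden states, so a pooling step is still needed to obtain sentence vectors. A mean-pooling sketch over the attention mask (shown on dummy tensors so it stands alone, without downloading the model):

```python
import torch

def mean_pool(token_embeddings, attention_mask):
    """Average token embeddings, ignoring padding positions."""
    mask = attention_mask.unsqueeze(-1).float()    # (B, T, 1)
    summed = (token_embeddings * mask).sum(dim=1)  # (B, D)
    counts = mask.sum(dim=1).clamp(min=1e-9)       # (B, 1) non-pad token counts
    return summed / counts

# Dummy batch: 2 sequences, 4 tokens each, 384-dim hidden states
emb = torch.randn(2, 4, 384)
mask = torch.tensor([[1, 1, 1, 0], [1, 1, 0, 0]])
sentence_vecs = mean_pool(emb, mask)   # shape (2, 384)
```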
## Author
Zain Asad
## License
MIT License
## Framework Versions
- PEFT 0.18.1
- Transformers 4.x
- PyTorch 2.x