---
base_model: sentence-transformers/all-MiniLM-L6-v2
library_name: peft
license: mit
tags:
- lora
- peft
- code
- programming
- software
- domain-adaptation
- sentence-embeddings
language:
- en
---

# Code LoRA Adapter for DomainEmbedder-v2.6

A domain-specific LoRA adapter that specializes `sentence-transformers/all-MiniLM-L6-v2` embeddings for code and programming text.

## Model Details

| Property | Value |
|----------|-------|
| **Base Model** | sentence-transformers/all-MiniLM-L6-v2 |
| **Parent System** | DomainEmbedder-v2.6 |
| **Domain** | Code / Programming |
| **LoRA Rank** | 16 |
| **LoRA Alpha** | 32 |
| **Target Modules** | query, value |
| **Trainable Params** | 147,456 (0.645%) |
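
The hyperparameters in the table map directly onto a `peft.LoraConfig`. The sketch below is a hypothetical reconstruction for illustration; the authoritative values live in the adapter's shipped `adapter_config.json`, and `lora_dropout`/`bias` are assumptions not stated in this card.

```python
from peft import LoraConfig

# Hypothetical reconstruction of this adapter's config from the table above.
lora_config = LoraConfig(
    r=16,                               # LoRA rank
    lora_alpha=32,                      # scaling factor (alpha / r = 2.0)
    target_modules=["query", "value"],  # attention projections in MiniLM
    lora_dropout=0.0,                   # assumption: not stated in the card
    bias="none",                        # assumption: not stated in the card
)
```

With `r=16` on the query and value projections of all six MiniLM layers, the adapter stays at roughly 147K trainable parameters, matching the table.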

## Training Data

Trained on 40,000 code-related text pairs from:
- Code Alpaca
- MBPP (Mostly Basic Python Problems)
- Code Contests
- Python Instructions

## Training Configuration

| Parameter | Value |
|-----------|-------|
| Epochs | 3 |
| Batch Size | 32 |
| Learning Rate | 2e-4 |
| Loss | Contrastive (InfoNCE) |
| Best Val Loss | 0.0039 |
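
The contrastive (InfoNCE) objective above treats each (query, positive) pair in a batch as a match and every other in-batch positive as a negative. A minimal sketch of that loss, shown with NumPy for clarity; the actual training implementation is not published with this card, and the `temperature` value is an assumption:

```python
import numpy as np

def info_nce_loss(query_emb, pos_emb, temperature=0.05):
    """Minimal InfoNCE with in-batch negatives over cosine similarity.

    Illustrative sketch only; `temperature` is an assumed value.
    """
    # L2-normalize so the dot product is cosine similarity
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    p = pos_emb / np.linalg.norm(pos_emb, axis=1, keepdims=True)
    logits = q @ p.T / temperature               # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # positives sit on the diagonal: pair i's query matches pair i's positive
    return -np.mean(np.diag(log_probs))
```

Driving the matched-pair similarity up relative to every in-batch negative is what pushes the validation loss toward the small value reported above.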

## Usage

This adapter is part of the DomainEmbedder-v2.6 system, whose RL routing policy selects it automatically when code-related content is detected. It can also be loaded standalone:

```python
from peft import PeftModel
from transformers import AutoModel

# Load base encoder
base_encoder = AutoModel.from_pretrained('sentence-transformers/all-MiniLM-L6-v2')

# Apply code LoRA
code_model = PeftModel.from_pretrained(base_encoder, 'path/to/code_lora')
```
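
Note that `AutoModel` returns a raw token-level encoder: to obtain sentence embeddings you still need the mask-aware mean pooling that sentence-transformers applies. A sketch of that step, written with NumPy arrays for clarity; with the models above you would pass `outputs.last_hidden_state` and the tokenizer's `attention_mask` (converted to arrays) instead:

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average token embeddings over real tokens, ignoring padding.

    token_embeddings: (batch, seq_len, dim)
    attention_mask:   (batch, seq_len), 1 for real tokens, 0 for padding
    """
    mask = attention_mask[..., None].astype(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)  # avoid divide-by-zero
    return summed / counts
```

Without this pooling (or an equivalent CLS/normalization step), cosine similarities between outputs will not match what the base sentence-transformers model produces.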

## Author

**Zain Asad**

## License

MIT License

## Framework Versions

- PEFT 0.18.1
- Transformers 4.x
- PyTorch 2.x