---
base_model: meta-llama/Llama-3.2-3B
library_name: peft
license: apache-2.0
datasets:
- trumancai/revela_code_training_corpus
language:
- en
tags:
- retrieval
- code
---

# Model Summary

**Revela-code-3b** is a self-supervised code retriever built on the 3B-parameter **Llama-3.2-3B** backbone.  
It was trained on 358K batches of code-centric text (Stack Overflow, tutorials, API docs) with the Revela next-token-prediction plus in-batch attention objective.  
Use it for code search, question answering over code, or hybrid doc/code retrieval.

- **Repository:** [TRUMANCFY/Revela](https://github.com/TRUMANCFY/Revela)  
- **Training Dataset:** [trumancai/revela_code_training_corpus](https://huggingface.co/datasets/trumancai/revela_code_training_corpus)
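
Like other bi-encoder retrievers, Revela reduces retrieval to embedding queries and code snippets into a shared vector space and ranking by similarity. A minimal illustration with toy vectors (the `cosine_rank` helper and all numbers are hypothetical stand-ins, not Revela's actual embeddings or API):

```python
import numpy as np

def cosine_rank(query_vec, doc_vecs):
    """Rank documents by cosine similarity to the query (highest first)."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q
    return np.argsort(-scores), scores

# Toy embeddings standing in for retriever outputs (hypothetical values).
query = np.array([0.9, 0.1, 0.0])
docs = np.array([
    [0.8, 0.2, 0.1],   # snippet close to the query in embedding space
    [0.0, 1.0, 0.0],   # unrelated snippet
])
order, scores = cosine_rank(query, docs)
print(order[0])  # index of the best-matching snippet
```

In practice the query and document vectors come from the model below; only the ranking step is shown here.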

# Other Links
| Resource | Description |
|:-------|:------------|
| [trumancai/Revela-code-3b](https://huggingface.co/trumancai/Revela-code-3b) | 3B-parameter code retriever. |
| [trumancai/Revela-code-1b](https://huggingface.co/trumancai/Revela-code-1b) | 1B-parameter code retriever. |
| [trumancai/Revela-code-500M](https://huggingface.co/trumancai/Revela-code-500M) | 500M-parameter code retriever. |
| [trumancai/Revela-3b](https://huggingface.co/trumancai/Revela-3b) | 3B-parameter Wikipedia retriever. |
| [trumancai/Revela-1b](https://huggingface.co/trumancai/Revela-1b) | 1B-parameter Wikipedia retriever. |
| [trumancai/Revela-500M](https://huggingface.co/trumancai/Revela-500M) | 500M-parameter Wikipedia retriever. |
| [trumancai/revela_code_training_corpus](https://huggingface.co/datasets/trumancai/revela_code_training_corpus) | Code training corpus. |
| [trumancai/revela_training_corpus](https://huggingface.co/datasets/trumancai/revela_training_corpus) | Wikipedia training corpus. |

# Usage
```python
import mteb
import torch
from mteb.model_meta import ModelMeta
from mteb.models.repllama_models import RepLLaMAWrapper, _loader

# Register the Revela PEFT adapter on top of the Llama-3.2-3B base model.
revela_llama_code_3b = ModelMeta(
    loader=_loader(
        RepLLaMAWrapper,
        base_model_name_or_path="meta-llama/Llama-3.2-3B",
        peft_model_name_or_path="trumancai/Revela-code-3b",
        device_map="auto",
        torch_dtype=torch.bfloat16,
    ),
    name="trumancai/Revela-code-3b",
    languages=["eng_Latn"],
    open_source=True,
    revision="974f4d8e7ff5d5439cc1863088948249f612c284",
    release_date="2025-10-07",
)

model = revela_llama_code_3b.loader()

# Evaluate on the AppsRetrieval task from the MTEB benchmark.
evaluation = mteb.MTEB(tasks=["AppsRetrieval"])
evaluation.run(model=model, output_folder="results/Revela-code-3b")
```

# License

Apache 2.0, as declared in the model card metadata above.

# Citation