# CodeWeave-LlamaCode

A fine-tuned code generation model built on Meta's CodeLlama foundation, specialized for enterprise code development.

## Model Description

CodeWeave-LlamaCode extends **codellama/CodeLlama-7b-Instruct-hf** with domain-specific fine-tuning for improved code quality and developer productivity.

### Base Model

- **Foundation**: CodeLlama-7b-Instruct-hf from Meta AI
- **Architecture**: Llama 2-based transformer
- **Parameters**: 7B

### Training Data

Fine-tuned using CodeWeave-Enterprise dataset containing:

- 40K enterprise code samples
- API integration patterns
- Security-focused code examples
- Documentation generation tasks

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and its matching tokenizer from the Hugging Face Hub.
model = AutoModelForCausalLM.from_pretrained("toolevalxm/CodeWeave-LlamaCode")
tokenizer = AutoTokenizer.from_pretrained("toolevalxm/CodeWeave-LlamaCode")
```
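Because the base model is CodeLlama-7b-Instruct, prompts should follow the CodeLlama instruction format, wrapping the request in `[INST] ... [/INST]` tags before tokenization. A minimal sketch (the helper name and example instruction are illustrative, not part of this model's API):

```python
# CodeLlama-Instruct models expect the instruction wrapped in [INST] ... [/INST].
def build_prompt(instruction: str) -> str:
    """Wrap a plain-language request in the CodeLlama-Instruct chat format."""
    return f"[INST] {instruction} [/INST]"

prompt = build_prompt("Write a Python function that reverses a string.")
print(prompt)  # [INST] Write a Python function that reverses a string. [/INST]
```

The resulting `prompt` can then be passed through `tokenizer(prompt, return_tensors="pt")` and on to `model.generate(...)` using the objects loaded above.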

## Evaluation Results

| Benchmark | Score |
|-----------|-------|
| HumanEval | 70.1% |
| MBPP | 65.8% |

## Acknowledgements

We thank Meta AI for developing the CodeLlama series.

## License

This model is released under the Llama 2 license, inherited from the CodeLlama-7b-Instruct-hf base model.