# Forge Coder v1.21.11

A specialized code generation model fine-tuned for Minecraft Forge mod development.

## Model Details

| Property | Value |
|----------|-------|
| **Base Model** | deepseek-ai/deepseek-coder-6.7b-instruct |
| **Fine-tuning Method** | LoRA (r=64, alpha=128) |
| **Trainable Parameters** | 159.9M (2.3% of total) |
| **Forge Version** | 1.21.11 |
| **Minecraft Version** | 1.21.11 |
| **MCP Version** | 20251209.095502 |
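
The adapter configuration is not published with this card, but a PEFT setup consistent with the numbers above can be sketched as follows. The `target_modules` list and dropout are assumptions, not taken from the training script; applying r=64 adapters to all seven projection matrices of this architecture is, however, consistent with the 159.9M trainable-parameter figure reported.

```python
import torch
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Hypothetical reconstruction of the adapter setup above; dropout and
# target_modules are assumptions, not taken from the training script.
lora_config = LoraConfig(
    r=64,
    lora_alpha=128,
    lora_dropout=0.05,  # assumed; not stated in this card
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",  # attention projections
        "gate_proj", "up_proj", "down_proj",     # MLP projections
    ],
    task_type="CAUSAL_LM",
)

base = AutoModelForCausalLM.from_pretrained(
    "deepseek-ai/deepseek-coder-6.7b-instruct", torch_dtype=torch.bfloat16
)
peft_model = get_peft_model(base, lora_config)
peft_model.print_trainable_parameters()  # ~159.9M trainable, ~2.3% of total
```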

## Training Data

- **Source Code**: 27 popular Forge mod repositories
- **Documentation**: Official Forge documentation
- **Total Java Files**: 22,916
- **Training Samples**: 13,936 (format sketched after this list)
- **Validation Samples**: 734
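
The exact serialization of the training samples is not documented here; assuming they follow the prompt template shown in the Usage section below, each sample presumably pairs an instruction with a reference completion in a three-part format:

```python
# Hypothetical training sample, inferred from the prompt template under Usage.
sample = (
    "### System:\n"
    "You are an expert Minecraft Forge mod developer.\n"
    "\n"
    "### User:\n"
    "Write a simple custom block class for Minecraft Forge 1.21.11\n"
    "\n"
    "### Assistant:\n"
    "public class SimpleBlock extends Block {\n"
    "    // ... reference implementation ...\n"
    "}\n"
)
```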

### Included Mods
Applied Energistics 2, JustEnoughItems, TerraFirmaCraft, Mekanism, Create, 
Thermal Expansion/Foundation, RFTools, Botania, Quark, Tinkers' Construct, 
Immersive Engineering, Twilight Forest, and more.

## Training Metrics

| Metric | Value |
|--------|-------|
| Training Time | 9h 12m |
| Final Train Loss | 0.27 |
| Final Eval Loss | 0.325 |
| Token Accuracy | 92.5% |
| Epochs | 3 |
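
Assuming these losses are mean per-token cross-entropy in nats, the final eval loss corresponds to a token-level perplexity of exp(0.325) ≈ 1.38.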

## Capabilities

The model is specialized in:
- Block and Item creation
- Entity programming
- GUI/Screen development
- Network packet handling
- World generation
- Event handling
- Registry systems
- Capability API
- Recipe systems
- Rendering code
- Data generation

## Usage

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model = "deepseek-ai/deepseek-coder-6.7b-instruct"
adapter_path = "path/to/forge-coder-v1.21.11"

# Load the tokenizer saved alongside the adapter, then the base model,
# and attach the LoRA adapter on top of it.
tokenizer = AutoTokenizer.from_pretrained(adapter_path)
model = AutoModelForCausalLM.from_pretrained(base_model, torch_dtype=torch.bfloat16)
model = PeftModel.from_pretrained(model, adapter_path)

prompt = """### System:
You are an expert Minecraft Forge mod developer.

### User:
Write a simple custom block class for Minecraft Forge 1.21.11

### Assistant:
"""

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
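
By default `generate` decodes greedily. For sampled output, the usual knobs can be passed through; the values below are illustrative starting points, not tuned recommendations for this adapter:

```python
# Illustrative sampling settings -- not tuned for this model.
outputs = model.generate(
    **inputs,
    max_new_tokens=512,
    do_sample=True,
    temperature=0.2,  # low temperature keeps generated code focused
    top_p=0.95,
)
# Decode only the newly generated tokens, skipping the echoed prompt.
completion = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(completion)
```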

## Version History

- **v1.21.11** (2024-12-18): Initial release for MC 1.21.11 / Forge 1.21.11

## License

This model is released under the same license as the base model (deepseek-coder).
Training data sourced from open-source repositories under various permissive licenses.