---
title: Full-Stack Fine-Tuning for Q
emoji: 🧠
colorFrom: yellow
colorTo: indigo
sdk: static
pinned: false
license: mit
short_description: Full-Stack Fine-Tuning for the Q Programming Language
---

# Full-Stack Fine-Tuning for the Q Programming Language

This is the project page for "Full-Stack Fine-Tuning for the Q Programming Language", a comprehensive approach to adapting large language models to specialized domains.

## Project Overview

We present an end-to-end methodology for adapting LLMs to the Q programming language, an array programming language widely used in quantitative finance. Our approach includes:

- **Dataset Construction**: LeetCode-style evaluation benchmark for Q
- **Domain-Adaptive Pretraining**: Training on curated Q code repositories  
- **Supervised Fine-Tuning**: Multi-task training on Q programming challenges
- **Reinforcement Learning**: Programmatic reward optimization
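A programmatic reward for code can be computed without human labels by executing each candidate against held-out test cases. As a rough sketch (the paper's exact reward shaping is not described on this page, and the interpreter command is a placeholder), the reward could be the fraction of input/output pairs the candidate program gets right:

```python
import subprocess

def programmatic_reward(interp_cmd, source_path, test_cases, timeout=5.0):
    """Fraction of (stdin, expected_stdout) test cases a candidate passes.

    interp_cmd  -- interpreter argv prefix, e.g. ["q"] (placeholder here)
    source_path -- path to the candidate program
    test_cases  -- list of (stdin_data, expected_stdout) string pairs
    """
    if not test_cases:
        return 0.0
    passed = 0
    for stdin_data, expected in test_cases:
        try:
            result = subprocess.run(
                interp_cmd + [source_path],
                input=stdin_data,
                capture_output=True,
                text=True,
                timeout=timeout,  # guard against non-terminating candidates
            )
        except subprocess.TimeoutExpired:
            continue  # timed-out case counts as failed
        if result.returncode == 0 and result.stdout.strip() == expected.strip():
            passed += 1
    return passed / len(test_cases)
```

This kind of verifiable, execution-based signal is what makes the RL stage possible without a learned reward model.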

Our best model achieves 59% pass@1 accuracy, surpassing Claude Opus-4 by 29.5%.
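For context, pass@1 is typically reported with the unbiased pass@k estimator of Chen et al. (2021): generate n samples per problem, count the c that pass all tests, and estimate the probability that at least one of k draws passes. A minimal sketch:

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: n samples per problem, c of which
    pass all test cases. Returns P(at least one of k draws passes)."""
    if n - c < k:
        return 1.0  # fewer failures than draws: some draw must pass
    return 1.0 - comb(n - c, k) / comb(n, k)
```

For k = 1 this reduces to the fraction of samples that pass, averaged over problems.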

## Links

- **Paper**: [Coming Soon - ArXiv Link]
- **Code**: [Coming Soon - GitHub Repository] 
- **Models**: [Coming Soon - HuggingFace Collection]

If you find this work useful, please cite:
```
@article{hogan2024fullstack,
  author    = {Hogan, Brendan R. and Brown, Will and Boyarsky, Adel and Schneider, Anderson and Nevmyvaka, Yuriy},
  title     = {Full-Stack Fine-Tuning for the Q Programming Language},
  journal   = {arXiv preprint},
  year      = {2024},
}
```

## Website License
<a rel="license" href="http://creativecommons.org/licenses/by-sa/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by-sa/4.0/88x31.png" /></a><br />This work is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by-sa/4.0/">Creative Commons Attribution-ShareAlike 4.0 International License</a>.