---
license: apache-2.0
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- pinescript
- tradingview
- code-generation
- finance
- trading
- codegemma
- fine-tuned
base_model: google/codegemma-7b-it
datasets:
- anthonym21/pinescript-v5-instructions
model-index:
- name: pinescript-v5-instructions-merged
results: []
---
# PineScript v5 Code Generator
A fine-tuned CodeGemma 7B model specialized in generating TradingView PineScript v5 code for trading indicators, strategies, and libraries.
## Model Description
This model was fine-tuned on 4,774 instruction/response pairs derived from the [PineScripts-Permissive](https://huggingface.co/datasets/mrmegatelo/PineScripts-Permissive) dataset, which contains high-quality, permissively-licensed PineScript v5 scripts from TradingView.
- **Base Model:** [google/codegemma-7b-it](https://huggingface.co/google/codegemma-7b-it)
- **Fine-tuning Method:** QLoRA (4-bit quantization + LoRA adapters)
- **Training Data:** 4,774 instruction/response pairs
- **Context Length:** 4096 tokens
## Intended Use
Generate PineScript v5 code for:
- Technical indicators (RSI, MACD, Bollinger Bands, custom indicators)
- Trading strategies with backtesting
- Reusable libraries
- Alert conditions
- Custom visualizations
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "anthonym21/pinescript-v5-instructions-merged"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

def generate_pinescript(prompt, max_tokens=1024):
    formatted = f"### Human: {prompt}\n### Assistant:"
    inputs = tokenizer(formatted, return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_tokens,
        temperature=0.7,
        top_p=0.9,
        do_sample=True,
        pad_token_id=tokenizer.eos_token_id,
    )
    response = tokenizer.decode(outputs[0], skip_special_tokens=True)
    return response.split("### Assistant:")[-1].strip()

# Example
code = generate_pinescript("Write a PineScript v5 RSI indicator with overbought/oversold zones")
print(code)
```
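The model may wrap generated code in markdown fences. A small post-processing helper (hypothetical, not part of the training pipeline) can strip them before pasting the script into the TradingView editor:

```python
import re

def extract_code(response: str) -> str:
    """Return the contents of the first fenced code block, or the raw text
    unchanged if no fence is present."""
    match = re.search(r"```(?:pinescript|pine)?\s*\n(.*?)```", response, re.DOTALL)
    return match.group(1).strip() if match else response.strip()

sample = 'Here is the indicator:\n```pinescript\n//@version=5\nindicator("RSI")\n```'
print(extract_code(sample))  # //@version=5\nindicator("RSI")
```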
## Example Prompts
| Prompt | Description |
|--------|-------------|
| "Write a PineScript v5 indicator that shows RSI with dynamic overbought/oversold levels" | RSI with adaptive levels |
| "Create a MACD crossover strategy with stop loss and take profit" | Complete trading strategy |
| "Write a Bollinger Bands indicator with squeeze detection" | Volatility indicator |
| "Create a multi-timeframe moving average indicator" | MTF analysis tool |
## Training Details
| Parameter | Value |
|-----------|-------|
| Epochs | 3 |
| Batch Size | 2 |
| Gradient Accumulation | 8 |
| Learning Rate | 2e-4 |
| LoRA r | 64 |
| LoRA alpha | 128 |
| Max Seq Length | 4096 |
| Quantization | 4-bit (nf4) |
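Batch size 2 with gradient accumulation 8 gives an effective batch of 16. The quantization and LoRA settings above roughly correspond to the following configuration (a sketch only; `target_modules` and the remaining training arguments are not documented in this card and are omitted):

```python
import torch
from transformers import BitsAndBytesConfig
from peft import LoraConfig

# 4-bit NF4 quantization, matching the table above
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# LoRA adapters: r=64, alpha=128 as listed above
lora_config = LoraConfig(
    r=64,
    lora_alpha=128,
    task_type="CAUSAL_LM",
)
```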
## Limitations
- Generates PineScript v5 syntax; may not be compatible with older versions
- Code should be reviewed and tested before live trading
- Complex multi-indicator strategies may require refinement
- Does not provide financial advice
## Dataset
Training data sourced from [mrmegatelo/PineScripts-Permissive](https://huggingface.co/datasets/mrmegatelo/PineScripts-Permissive):
- 5,848 PineScript v5 scripts
- Filtered to 4,774 high-quality examples
- Includes indicators, strategies, and libraries
- All scripts under permissive licenses (MPL-2.0, Apache-2.0)
## Citation
```bibtex
@misc{pinescript-v5-instructions-merged,
author = {Anthony Maio},
title = {PineScript v5 Code Generator},
year = {2025},
publisher = {HuggingFace},
url = {https://huggingface.co/anthonym21/pinescript-v5-instructions-merged}
}
```
## License
Apache 2.0