---
license: mit
tags:
- jekyll
- static-site
- code-generation
- ruby
- liquid
- web-development
base_model: deepseek-ai/deepseek-coder-1.3b-instruct
datasets:
- daffaaditya/jekyll-master-dataset
language:
- en
- id
---

# 🎯 Jekyll Master AI

A fine-tuned DeepSeek-Coder model specialized in the Jekyll static site generator.

## Model Description

This model is fine-tuned from DeepSeek-Coder-1.3B to become an expert in Jekyll, a static site generator written in Ruby.

**Specializations:**
- Liquid templating language
- YAML configuration files (_config.yml)
- Jekyll plugin development
- Sass/SCSS styling
- GitHub Pages deployment
- SEO optimization
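
To make the scope concrete, below are the kinds of requests the model is meant to handle, one per specialization; the exact phrasings are illustrative, not taken from the training set.

```python
# Illustrative prompts, one per specialization (wording is hypothetical)
prompts = [
    "Write a Liquid loop that lists the five most recent posts",
    "Create a _config.yml for a documentation site with pagination",
    "Write a Jekyll plugin that generates a sitemap.xml",
    "Write SCSS for a responsive two-column post layout",
    "Set up a GitHub Actions workflow that deploys the site to GitHub Pages",
    "Add SEO meta tags to the default layout",
]
```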

## Training Data

The model was fine-tuned on 192 examples covering:
- Configuration files (15%)
- Layouts & templates (20%)
- Includes & components (15%)
- Plugins (10%)
- Sass/SCSS (15%)
- Liquid filters (10%)
- Deployment configs (10%)
- Front matter & data files (5%)
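
The record schema of `daffaaditya/jekyll-master-dataset` is not spelled out here; assuming a standard instruction-tuning layout, an example in the "Layouts & templates" category might look like the following sketch (field names and content are hypothetical):

```python
# Hypothetical training record; the actual dataset schema may differ
example = {
    "instruction": "Create a default layout that wraps page content "
                   "with a shared header and footer",
    "category": "layouts",  # one of the categories listed above
    "output": (
        "<!DOCTYPE html>\n"
        "<html>\n"
        "  {% include header.html %}\n"
        "  {{ content }}\n"
        "  {% include footer.html %}\n"
        "</html>\n"
    ),
}
```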

## Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

# Load the base model and tokenizer
base_model = "deepseek-ai/deepseek-coder-1.3b-instruct"
model = AutoModelForCausalLM.from_pretrained(base_model)
tokenizer = AutoTokenizer.from_pretrained(base_model)

# Attach the fine-tuned PEFT adapter
model = PeftModel.from_pretrained(model, "daffaaditya/jekyll-master-ai")

# Generate Jekyll code; prompts can be in English or Indonesian
prompt = "Create a _config.yml file for a technology blog"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=500)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
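
For deployment without a `peft` dependency at inference time, the adapter can be folded into the base weights. A minimal sketch, assuming the adapter is LoRA-style (the output path is illustrative):

```python
# Merge the adapter into the base model and save a standalone copy
merged = model.merge_and_unload()
merged.save_pretrained("jekyll-master-ai-merged")
tokenizer.save_pretrained("jekyll-master-ai-merged")
```

The merged model can then be reloaded with `AutoModelForCausalLM.from_pretrained` alone.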