---
license: mit
tags:
- jekyll
- static-site
- code-generation
- ruby
- liquid
- web-development
base_model: deepseek-ai/deepseek-coder-1.3b-instruct
datasets:
- daffaaditya/jekyll-master-dataset
language:
- en
- id
---

# 🎯 Jekyll Master AI

A fine-tuned DeepSeek-Coder model specialized in the Jekyll static site generator.

## Model Description

This model is fine-tuned from DeepSeek-Coder-1.3B to serve as an expert in Jekyll, a static site generator written in Ruby.

**Specializations:**
- Liquid templating language
- YAML configuration files (_config.yml)
- Jekyll plugin development
- Sass/SCSS styling
- GitHub Pages deployment
- SEO optimization

## Training Data

The model was fine-tuned on 192 examples covering:
- Configuration files (15%)
- Layouts & templates (20%)
- Includes & components (15%)
- Plugins (10%)
- Sass/SCSS (15%)
- Liquid filters (10%)
- Deployment configs (10%)
- Front matter & data files (5%)
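
The dataset is published on the Hub as `daffaaditya/jekyll-master-dataset` (see the metadata above). A minimal sketch for inspecting it, assuming it exposes a standard `train` split loadable with the `datasets` library:

```python
from datasets import load_dataset

# Load the fine-tuning dataset from the Hugging Face Hub
# (assumes a default "train" split; adjust the split name if the
#  dataset uses a different layout)
ds = load_dataset("daffaaditya/jekyll-master-dataset", split="train")

print(len(ds))   # expected: 192 examples
print(ds[0])     # inspect the fields of a single example
```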

## Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

# Load the base model and tokenizer
base_model = "deepseek-ai/deepseek-coder-1.3b-instruct"
model = AutoModelForCausalLM.from_pretrained(base_model)
tokenizer = AutoTokenizer.from_pretrained(base_model)

# Load the fine-tuned adapter on top of the base model
model = PeftModel.from_pretrained(model, "daffaaditya/jekyll-master-ai")

# Generate code. The prompt is Indonesian for "Create a _config.yml
# file for a technology blog"; English prompts work as well.
prompt = "Buat file _config.yml untuk blog teknologi"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=500)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
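
DeepSeek-Coder instruct checkpoints normally ship a chat template, so prompts can also be formatted through the tokenizer rather than passed raw. A sketch continuing from the snippet above, assuming the base tokenizer exposes `apply_chat_template`:

```python
# Format the prompt with the instruct model's chat template
# (assumes the tokenizer ships a chat template, as DeepSeek-Coder
#  instruct checkpoints typically do)
messages = [
    {"role": "user", "content": "Create a _config.yml file for a technology blog"}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(inputs, max_new_tokens=500)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```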
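
To serve the model without a PEFT dependency at inference time, the adapter can be folded into the base weights. A sketch, assuming the adapter is a LoRA-style PEFT adapter that supports merging:

```python
# Merge the adapter into the base weights and save a standalone checkpoint
# (the output directory name here is just an example)
merged = model.merge_and_unload()
merged.save_pretrained("jekyll-master-merged")
tokenizer.save_pretrained("jekyll-master-merged")
```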