wpostma committed · Commit 77f694b · verified · Parent: 3d69154

Upload README.md with huggingface_hub

Files changed (1): README.md (+61, -0)
---
license: other
license_name: deepseek
base_model: deepseek-ai/deepseek-coder-6.7b-instruct
tags:
- delphi
- objectpascal
- code-assistant
- system-prompt
language:
- en
---

# deepseek-coder-6.7b-instruct-de1

**A Delphi/ObjectPascal coding assistant built on deepseek-coder-6.7b-instruct.**

A 13-line FATPIE system prompt with a 4096-token context window, scoring 127/136 on the project's smoketest.
## What is this?

This is **not a fine-tuned model**. It uses the unmodified deepseek-coder-6.7b-instruct
weights with a carefully crafted system prompt that activates the model's existing
knowledge of Delphi/ObjectPascal conventions.

Six QLoRA fine-tuning attempts on this base model either had zero effect or caused
regression. The system prompt alone produces better results than any fine-tuned variant.
## How to use

### With Ollama

```bash
# Download the Modelfile, then create the model:
ollama create deepseek-coder:6.7b-instruct-de1 -f Modelfile.de1

# Run it:
ollama run deepseek-coder:6.7b-instruct-de1 "Write a Delphi function that reverses a string"
```

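The actual Modelfile.de1 is distributed alongside this README and is not reproduced here. As a rough sketch, an Ollama Modelfile for this setup follows the shape below — the `SYSTEM` text is a placeholder, not the real 13-line prompt, and `num_ctx` matches the 4096-token context noted above:

```
FROM deepseek-coder:6.7b-instruct
PARAMETER num_ctx 4096
SYSTEM """
<the 13-line FATPIE system prompt goes here>
"""
```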
### What the system prompt teaches

- **Memory management**: No ARC or GC on Windows. Free every object you create.
- **FATPIE naming**: T for types, F for fields, A for parameters, P for pointers, I for interfaces, E for exceptions.
- **Code style**: Use Result not FunctionName, begin not BEGIN, try/finally for cleanup.
- **Platform**: Delphi on Windows, MSBuild for builds, TDateTime for dates.
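Taken together, these conventions produce code in the following shape. This is an illustrative sketch written for this README — `TGreeter` is a made-up class, not something the model or the prompt ships with:

```pascal
type
  // 'T' prefix for class/record types; 'I' for interfaces, 'E' for exceptions.
  TGreeter = class
  private
    FName: string; // 'F' prefix for fields.
  public
    constructor Create(const AName: string); // 'A' prefix for parameters.
    function Greeting: string;
  end;

constructor TGreeter.Create(const AName: string);
begin
  inherited Create;
  FName := AName;
end;

function TGreeter.Greeting: string;
begin
  Result := 'Hello, ' + FName; // Assign to Result, not to the function name.
end;

procedure UseGreeter;
var
  Greeter: TGreeter;
begin
  Greeter := TGreeter.Create('World');
  try
    WriteLn(Greeter.Greeting);
  finally
    Greeter.Free; // No ARC/GC on Windows: free every object you create.
  end;
end;
```

The try/finally block guarantees `Free` runs even if an exception is raised, which is the cleanup pattern the prompt's memory-management rule calls for.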
## Model details

- **Base model**: [deepseek-ai/deepseek-coder-6.7b-instruct](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-instruct)
- **Weights**: Unmodified — no LoRA, no fine-tuning
- **Method**: System prompt engineering only
- **Author**: Warren Postma (warren.postma@gmail.com)
- **Project**: [WARP — local AI agent for Delphi/ObjectPascal](https://github.com/wpostma)
## Key finding

> A well-crafted system prompt on an unmodified 6.7B model beats QLoRA fine-tuning
> with 500-1000 curated instruction pairs. The model already knows Delphi — it just
> needs the right context to access that knowledge.