pipeline_tag: text-generation
library_name: transformers
tags:
- agent
---
# Jan-Code-4B: a small code-tuned model based on Jan-v3

[![GitHub](https://img.shields.io/badge/GitHub-Repository-blue?logo=github)](https://github.com/janhq/jan)
[![License](https://img.shields.io/badge/License-Apache%202.0-yellow)](https://opensource.org/licenses/Apache-2.0)
[![Jan App](https://img.shields.io/badge/Powered%20by-Jan%20App-purple?style=flat&logo=android)](https://jan.ai/)

## Overview
**Jan-Code-4B** is a compact **code-tuned** model built on top of **Jan-v3-4B-base-instruct**. It is designed to be a practical coding model you can run locally and iterate on quickly: useful for everyday code tasks and as a lightweight "worker" model in agentic workflows.

Compared to larger coding models, Jan-Code focuses on handling **well-scoped subtasks** reliably (edits, refactors, tests, and debugging help) while keeping latency and compute requirements low.
## Intended Use

* **Lightweight coding assistant** for generation, editing, refactoring, and debugging
* **A small, fast worker model** for agent setups (e.g., as a sub-agent that produces patches and tests while a larger model plans)
* **A strong starting point for code-focused fine-tunes** on internal repos or domain codebases
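The planner/worker pattern above can be sketched as follows. This is a hypothetical illustration, not an API of the Jan stack: both model calls are stubbed with plain functions, where in practice each would be a chat-completion request to a serving endpoint.

```python
# Hypothetical sketch: a larger "planner" model breaks a task into
# well-scoped subtasks, and a small worker model (e.g. Jan-Code-4B)
# handles each one. Both calls are stubbed here.

def plan(task: str) -> list[str]:
    """Stub for the large planning model: split a task into subtasks."""
    return [
        f"write a failing test for: {task}",
        f"implement the fix for: {task}",
        f"refactor and clean up: {task}",
    ]

def worker(subtask: str) -> str:
    """Stub for the small code model: return a patch for one subtask."""
    return f"# patch for: {subtask}"

def run(task: str) -> list[str]:
    # Delegate every planned subtask to the worker model.
    return [worker(s) for s in plan(task)]

patches = run("off-by-one error in pagination")
```

The key design point is that the small model never sees the whole task, only one bounded subtask at a time, which is the regime Jan-Code targets.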
## Performance
## Quick Start

### Integration with Jan Apps

A Jan-v3 demo is hosted on **Jan Browser** at **[chat.jan.ai](https://chat.jan.ai/)**. The model is also optimized for direct integration with [Jan Desktop](https://jan.ai/); select it in the app to start using it.
### Local Deployment

**Using vLLM:**
```bash
vllm serve janhq/Jan-code \
  --host 0.0.0.0 \
  --port 1234 \
  --enable-auto-tool-choice \
  --tool-call-parser hermes
```

**Using llama.cpp:**
```bash
llama-server --model Jan-code-Q8_0.gguf \
  --host 0.0.0.0 \
  --port 1234 \
  --jinja \
  --no-context-shift
```
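Both servers expose an OpenAI-compatible API on the port configured above, with tool calling enabled via the flags shown for vLLM. As a minimal sketch, the request body for a chat completion with one tool can be built like this; the `run_tests` tool schema is a hypothetical example, and the model name is taken from the vLLM command above:

```python
import json

# OpenAI-style chat completion payload with a (hypothetical) tool the
# model may call; the hermes parser turns tool calls into structured output.
payload = {
    "model": "janhq/Jan-code",
    "messages": [
        {"role": "user", "content": "Write a Python function that reverses a string."}
    ],
    "tools": [{
        "type": "function",
        "function": {
            "name": "run_tests",  # hypothetical tool for illustration
            "description": "Run the project's test suite and return the results.",
            "parameters": {"type": "object", "properties": {}},
        },
    }],
}

body = json.dumps(payload)
# POST `body` to http://localhost:1234/v1/chat/completions
# with header Content-Type: application/json.
```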

### Recommended Parameters

For optimal performance in agentic and general tasks, we recommend the following inference parameters:

```yaml
temperature: 0.7
top_p: 0.8
top_k: 20
```
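These map directly onto fields of the request body. Note that `top_k` is not part of the core OpenAI schema, but both vLLM's and llama.cpp's OpenAI-compatible servers accept it as an extension field; the model name below is the one from the vLLM example:

```python
import json

# Recommended sampling parameters expressed as request fields.
sampling = {"temperature": 0.7, "top_p": 0.8, "top_k": 20}

request = {
    "model": "janhq/Jan-code",
    "messages": [{"role": "user", "content": "Refactor this function to be iterative."}],
    **sampling,  # merge the sampling settings into the request body
}

body = json.dumps(request)
```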

## 🤝 Community & Support

- **Discussions**: [Hugging Face Community](https://huggingface.co/janhq/Jan-code/discussions)
- **Jan App**: Learn more about the Jan App at [jan.ai](https://jan.ai/)

## 📄 Citation

A BibTeX entry will be added soon.