Made cleaner formatting for model card

#2
Files changed (1)
  1. README.md +47 -81
README.md CHANGED
@@ -1,24 +1,13 @@
 # Ginie — Smart Contract LLM
 
-The first AI model purpose-built to generate, compile, audit, and deploy smart contracts across institutional and public blockchains. Plain English in. Production-ready contract out. On-chain in under 90 seconds.
-
 [![Website](https://img.shields.io/badge/Website-ginie.xyz-blue)](https://ginie.xyz)
 [![npm](https://img.shields.io/badge/npm-30k%2B_weekly_downloads-red)](https://npmjs.com/package/ginie-sdk)
 [![License](https://img.shields.io/badge/License-MIT-green)](https://opensource.org/licenses/MIT)
 [![Demo](https://img.shields.io/badge/Demo-Live-brightgreen)](https://huggingface.co/spaces/GinieAI/Ginie-Demo)
-[![Canton](https://img.shields.io/badge/Canton_Network-Supported-purple)](https://canton.network)
-
----
-
-## What is Ginie?
-
-Ginie is the developer layer for the next generation of on-chain applications. The friction keeping developers off-chain is not the blockchain itself — it is the specialised languages, compiler toolchains, and security requirements that sit between an idea and a deployed contract. Ginie removes all of that.
 
-Write a description. Get a contract that compiles, passes security checks, and deploys across Solidity (Ethereum, Avalanche, Camp Network), Daml (Canton Network), and Rust (Vara Network).
 
-Canton Network alone processes $6 trillion in tokenised assets, backed by Goldman Sachs, JPMorgan, and DTCC. Every institution building on it needs smart contracts. Ginie writes them.
-
----
 
 ## Quickstart
 
@@ -30,16 +19,16 @@ tokenizer = AutoTokenizer.from_pretrained("GinieAI/Solidity-LLM")
 model = AutoModelForCausalLM.from_pretrained(
     "GinieAI/Solidity-LLM",
     torch_dtype=torch.bfloat16,
-    device_map="cuda"
 )
 
 prompt = """### Instruction:
-Write a Solidity ERC20 token contract with minting, burning, and owner controls.
 
 ### Response:
 """
 
-inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
 outputs = model.generate(
     **inputs,
     max_new_tokens=800,
@@ -49,48 +38,43 @@ outputs = model.generate(
 )
 
 print(tokenizer.decode(outputs[0], skip_special_tokens=True))
 
-npm SDK — 30,000+ active weekly downloads
-
 npm install ginie-sdk
 
-
 import { Ginie } from 'ginie-sdk'
 
 const ginie = new Ginie({ apiKey: 'your-key' })
 
 const contract = await ginie.generate({
-  prompt: 'ERC20 token with vesting schedule for a startup',
   chain: 'ethereum',
   audit: true
 })
 
-console.log(contract.code)
-console.log(contract.securityScore)
-console.log(contract.compiled)
 
-Model Details
-
-
-
-|Property      |Value                        |
-|--------------|-----------------------------|
-|Developer     |[Ginie AI](https://ginie.xyz)|
-|Model type    |Causal LM — Code Generation  |
-|Parameters    |2 Billion                    |
-|Architecture  |32 Transformer blocks        |
-|Context length|2048 tokens                  |
-|Precision     |bfloat16                     |
-|Tokenizer     |GPT-2                        |
-|Base model    |Chain-GPT/Solidity-LLM       |
-|License       |MIT                          |
-
-Performance
-Evaluated against GPT-4o mini and DeepSeek-Coder-7B on 100 Solidity contract generation prompts. Compilation success and security assessed via Slither static analysis. OpenZeppelin compliance assessed against standard library usage patterns.
 
 |Metric                 |Ginie v1|GPT-4o mini|DeepSeek-Coder-7B|
 |-----------------------|--------|-----------|-----------------|
@@ -99,55 +83,37 @@ Evaluated against GPT-4o mini and DeepSeek-Coder-7B on 100 Solidity contract gen
 |Gas efficiency         |**72%** |65%        |63%              |
 |Security score         |**58%** |54%        |51%              |
 
-Ginie achieves the highest compilation rate despite being the smallest model in the comparison — a direct result of domain specialisation over general-purpose scale.
-
-What Ginie generates today
-∙ ERC20, ERC721, ERC1155 token contracts
-∙ DeFi protocols — staking, liquidity pools, yield farming
-∙ DAO and governance contracts
-∙ Multisig wallets and escrow agreements
-∙ NFT marketplaces
-∙ Automated compliance and audit loops
-Chains supported
-
-|Chain         |Language|Status    |
-|--------------|--------|----------|
-|Ethereum      |Solidity|Live      |
-|Avalanche     |Solidity|Live      |
-|Camp Network  |Solidity|Live      |
-|Canton Network|Daml    |v3 roadmap|
-|Vara Network  |Rust    |v3 roadmap|
 
-Not suitable for
-∙ Production deployment without expert review
-∙ Formal legal contract auditing
-∙ Non-code generation tasks
 
-Roadmap
-
-|Version   |What ships                                               |
-|----------|---------------------------------------------------------|
-|v1.0 (now)|Solidity generation — 2B params, 83% compile rate        |
-|v2.0      |Expanded corpus — DISL + Zellic, 7,800+ training examples|
-|v3.0      |Daml and Rust support — Canton Network and Vara Network  |
-|v4.0      |Data flywheel — weekly retraining on real user prompts   |
 
-The v4 flywheel is the permanent moat. Every contract a user successfully generates becomes a training example for the next version. The model improves weekly from real usage — a data distribution no statically trained competitor can replicate.
 
-Training
-Ginie v1 is fine-tuned from Chain-GPT/Solidity-LLM using LoRA on a curated Solidity instruction dataset. Training focused on instruction-following quality, OpenZeppelin pattern adherence, and compilable output over raw token prediction.
-Security validation uses Slither static analysis. Compilation validation uses solc. Both are integrated into the generation pipeline — not just evaluation.
 
-License and Attribution
-Released under the MIT License.
-Built on Chain-GPT/Solidity-LLM by ChainGPT, which is fine-tuned from Salesforce/codegen-2B-multi. Full credit to the original authors. Ginie extends this work for the institutional blockchain ecosystem.
 
-About Ginie AI
-Ginie AI is building the developer layer for institutional blockchain. Backed by the Canton Foundation and supported by the Canton Network ecosystem — the institutional blockchain processing $6 trillion in tokenised assets with Goldman Sachs, JPMorgan, and DTCC.
-ginie.xyz · npm SDK · Live demo
 
-Smart contracts generated by Ginie should be reviewed by a qualified developer before production deployment. Security scores are indicative and do not constitute a formal audit.
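
The removed Training section states that Ginie v1 was fine-tuned with LoRA. LoRA freezes the base weight matrix W and learns only a low-rank update ΔW = (α/r)·B·A. The following is a dependency-free sketch of the resulting forward pass for a single linear layer; the shapes, values, and helper names are illustrative, not Ginie's training code:

```python
# LoRA forward pass on one linear layer, written with plain lists so it runs
# anywhere. W is the frozen base weight; only A (r x d_in) and B (d_out x r)
# would be trained. Shapes and numbers here are illustrative.
def matvec(M, x):
    """Matrix-vector product over nested lists."""
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

def lora_forward(W, A, B, x, alpha, r):
    base = matvec(W, x)              # frozen path: W @ x
    delta = matvec(B, matvec(A, x))  # low-rank path: B @ (A @ x)
    scale = alpha / r                # standard LoRA scaling factor
    return [b + scale * d for b, d in zip(base, delta)]

# Tiny 2x2 example with rank r = 1.
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[1.0, 1.0]]    # 1 x 2
B = [[0.5], [0.0]]  # 2 x 1
print(lora_forward(W, A, B, [2.0, 3.0], alpha=2.0, r=1))  # [7.0, 3.0]
```

Because only A and B receive gradients, the trainable parameter count is a small fraction of the 2B base model, which is what makes the fine-tune described in the card cheap relative to full training.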
 
 # Ginie — Smart Contract LLM
 
 [![Website](https://img.shields.io/badge/Website-ginie.xyz-blue)](https://ginie.xyz)
 [![npm](https://img.shields.io/badge/npm-30k%2B_weekly_downloads-red)](https://npmjs.com/package/ginie-sdk)
 [![License](https://img.shields.io/badge/License-MIT-green)](https://opensource.org/licenses/MIT)
 [![Demo](https://img.shields.io/badge/Demo-Live-brightgreen)](https://huggingface.co/spaces/GinieAI/Ginie-Demo)
 
+Generate, compile, and deploy smart contracts from plain English. Built for Solidity, Daml (Canton Network), and Rust (Vara Network).
 
+-----
 
 ## Quickstart
 
 model = AutoModelForCausalLM.from_pretrained(
     "GinieAI/Solidity-LLM",
     torch_dtype=torch.bfloat16,
+    device_map="auto"
 )
 
 prompt = """### Instruction:
+Write a Solidity ERC20 token with minting, burning, and owner controls.
 
 ### Response:
 """
 
+inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
 outputs = model.generate(
     **inputs,
     max_new_tokens=800,
 )
 
 print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+```
 
+**npm SDK**
 
+```bash
 npm install ginie-sdk
+```
 
+```javascript
 import { Ginie } from 'ginie-sdk'
 
 const ginie = new Ginie({ apiKey: 'your-key' })
 
 const contract = await ginie.generate({
+  prompt: 'ERC20 token with vesting schedule',
   chain: 'ethereum',
   audit: true
 })
+```
 
+-----
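
The quickstart above writes the instruction template by hand. Factoring it into a helper keeps the markers consistent across prompts; the helper name is ours, but the `### Instruction:` / `### Response:` markers come straight from the quickstart:

```python
def build_prompt(instruction: str) -> str:
    """Wrap a task description in the instruction template shown in the quickstart."""
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

prompt = build_prompt(
    "Write a Solidity ERC20 token with minting, burning, and owner controls."
)
print(prompt)
```

Instruction-tuned models are sensitive to their training template, so keeping the exact marker strings (including the blank line before `### Response:`) matters more than it looks.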
 
 
62
 
63
+ ## Model
64
 
65
+ |Property |Value |
66
+ |--------------|----------------------|
67
+ |Parameters |2B |
68
+ |Architecture |32 Transformer blocks |
69
+ |Context length|2048 tokens |
70
+ |Precision |bfloat16 |
71
+ |Tokenizer |GPT-2 |
72
+ |Base model |Chain-GPT/Solidity-LLM|
73
+ |License |MIT |
 
 
 
 
 
 
 
 
 
 
+-----
 
+## Performance
 
 |Metric                 |Ginie v1|GPT-4o mini|DeepSeek-Coder-7B|
 |-----------------------|--------|-----------|-----------------|
 |Gas efficiency         |**72%** |65%        |63%              |
 |Security score         |**58%** |54%        |51%              |
 
+Evaluated on 100 prompts. Security via Slither static analysis.
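
The card assesses security with Slither, which can emit machine-readable findings via `slither <target> --json -`; each detector result in that report carries an `impact` field (High, Medium, Low, Informational). As a sketch of how such a report could feed a score, here is a dependency-free tally of findings by impact level; the sample JSON is illustrative, not a real Ginie report:

```python
import json

def count_by_impact(slither_json: str) -> dict:
    """Tally Slither detector findings by their reported impact level."""
    report = json.loads(slither_json)
    counts = {}
    for det in report.get("results", {}).get("detectors", []):
        impact = det.get("impact", "Unknown")
        counts[impact] = counts.get(impact, 0) + 1
    return counts

# Illustrative stand-in for the JSON that `slither contract.sol --json -` prints.
sample = '{"results": {"detectors": [{"impact": "High"}, {"impact": "Low"}, {"impact": "Low"}]}}'
print(count_by_impact(sample))  # {'High': 1, 'Low': 2}
```

How the tallied counts map onto the percentage "security score" in the table is not specified in the card.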
 
 
 
 
 
 
 
 
 
 
 
+-----
 
+## Chains
 
+|Chain                            |Language|Status|
+|---------------------------------|--------|------|
+|Ethereum, Avalanche, Camp Network|Solidity|Live  |
+|Canton Network                   |Daml    |v3    |
+|Vara Network                     |Rust    |v3    |
 
+-----
 
+## Roadmap
 
+|Version|Ships                                  |
+|-------|---------------------------------------|
+|v1.0   |Solidity — 2B params                   |
+|v2.0   |DISL + Zellic corpus — 7,800+ examples |
+|v3.0   |Daml + Rust support                    |
+|v4.0   |Weekly retraining flywheel on user data|
 
+-----
 
 
 
 
 
 
+## Attribution
 
+Fine-tuned from [Chain-GPT/Solidity-LLM](https://huggingface.co/Chain-GPT/Solidity-LLM), itself based on [Salesforce/codegen-2B-multi](https://huggingface.co/Salesforce/codegen-2B-multi). MIT License.
 
+-----
 
+[ginie.xyz](https://ginie.xyz) · Canton Network processes $6T in tokenised assets backed by Goldman Sachs, JPMorgan, and DTCC.
 
+> Generated contracts require expert review before production deployment.