---
language:
- en
base_model: ibm-granite/granite-4.0-1b
tags:
- granite
- ibm
- full-finetune
- multi-gpu
- pytorch
- code
- text-generation
pipeline_tag: text-generation
datasets:
- Antired/tradehax-xai-grok-trading-visual-prompts
- lvogel123/arc-agi-1-grok-4
- Crownelius/Hyper-Creative-Grok-V1
- Crownelius/Hyper-Logic-Grok-V1
- Crownelius/Hyper-UltraData-Grok-V1
- TeichAI/brainstorm-v3.1-grok-4-fast-200x
- TeichAI/grok-code-fast-1-1000x
- nmayorga7/math-grok-4
- Liontix/grok-code-fast-1-200x
- Antired/tradehax-xai-grok-image-capabilities
---

# IBM-Grok4-UltraFast-Coder-1B

This model is a fully fine-tuned derivative of `ibm-granite/granite-4.0-1b`.

Training notes:
- Full model fine-tuning
- No adapters
- No LoRA
- No QLoRA
- Dual-GPU DDP training on Kaggle
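The notes above describe a full-parameter DDP run (every weight trainable, no adapters). A minimal sketch of that setup is below; it uses a tiny stand-in model and a single CPU process with the `gloo` backend so it stays runnable anywhere, whereas the actual run launched two processes on Kaggle GPUs (e.g. via `torchrun --nproc_per_node=2`) with `nccl`. Model and hyperparameters here are illustrative, not the training configuration.

```python
# Minimal sketch of full-parameter DDP fine-tuning (no adapters, no LoRA).
# Stand-in model and single-process gloo backend for portability; the real
# run used ibm-granite/granite-4.0-1b across two GPUs.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def train_step() -> float:
    # Single-process stand-in for the dual-GPU launch.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=0, world_size=1)

    model = torch.nn.Linear(32, 4)  # stand-in for the 1B-parameter model
    # Full fine-tune: every parameter stays trainable.
    assert all(p.requires_grad for p in model.parameters())

    ddp_model = DDP(model)  # on GPU: DDP(model, device_ids=[local_rank])
    opt = torch.optim.AdamW(ddp_model.parameters(), lr=1e-5)

    x = torch.randn(8, 32)                # stand-in batch
    y = torch.randint(0, 4, (8,))         # stand-in labels
    loss = torch.nn.functional.cross_entropy(ddp_model(x), y)

    opt.zero_grad()
    loss.backward()   # DDP all-reduces gradients across ranks here
    opt.step()

    dist.destroy_process_group()
    return loss.item()


if __name__ == "__main__":
    print(f"loss: {train_step():.4f}")
```

In a real multi-GPU launch, each rank would pin its model to `cuda:{local_rank}` and wrap the dataset in a `DistributedSampler` so the two GPUs see disjoint shards.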