---
license: apache-2.0
library_name: pytorch
tags:
  - language-model
  - causal-lm
  - gpt
  - from-scratch
  - educational
pipeline_tag: text-generation
framework: pytorch
---

# SimBot GPT (Level 1)

SimBot GPT is a **from-scratch GPT-style language model** implemented in **PyTorch**.
This project is focused on **learning LLM internals**, not on instruction tuning or production use.

---

## Model Overview

- **Architecture:** Decoder-only Transformer (GPT-like)
- **Training Objective:** Causal Language Modeling
- **Dataset:** Domain-specific text (Simdega / regional data)
- **Purpose:** Educational (understanding how LLMs work internally)

⚠️ This is a **base language model**, not instruction-tuned and not grounded with RAG.
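Since the model is trained with a causal language modeling objective, each position may attend only to itself and earlier positions. As an educational aside, this is a minimal sketch of the causal attention mask a decoder-only Transformer uses (the helper name `causal_mask` is illustrative, not part of this repository):

```python
import torch

# Causal mask for a decoder-only Transformer: position i may attend
# only to positions <= i. False entries mark "future" positions that
# must be hidden during training and generation.
def causal_mask(seq_len: int) -> torch.Tensor:
    # Lower-triangular boolean matrix: True on and below the diagonal.
    return torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool))

print(causal_mask(4))
```

In practice this mask is applied inside each attention layer, typically by setting masked logits to `-inf` before the softmax.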

---

## Repository Contents

- `simbot.safetensors` — model weights in the safetensors format (safe, recommended by Hugging Face)
- `tokenizer.json` — BPE tokenizer
- `config.json` — model hyperparameters
- `model/simbot.py` — model architecture (PyTorch)

---

## Requirements (Inference Only)

The following packages are **required to load and run the model**:

```txt
torch==2.9.1
tokenizers==0.22.1
safetensors
```

---

## Usage Example

```python
import json
from safetensors.torch import load_file
from tokenizers import Tokenizer
from model.simbot import SIMGPT

# Load tokenizer
tokenizer = Tokenizer.from_file("tokenizer.json")

# Load config
with open("config.json") as f:
    cfg = json.load(f)

# Build model
model = SIMGPT(
    vocab_size=cfg["vocab_size"],
    block_size=cfg["block_size"],
    n_layers=cfg["n_layers"],
    n_heads=cfg["n_heads"],
    d_model=cfg["d_model"]
)

# Load weights
state_dict = load_file("simbot.safetensors")
model.load_state_dict(state_dict)
model.eval()
```
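Once the weights are loaded, text can be generated token by token. The sketch below shows greedy decoding under the assumption that the model's forward pass takes a `(batch, seq)` tensor of token ids and returns logits of shape `(batch, seq, vocab_size)`; check `model/simbot.py` for SIMGPT's actual signature before relying on this.

```python
import torch

# Hedged sketch of greedy decoding for a causal LM. Assumes
# model(ids) -> logits of shape (batch, seq, vocab_size); this is a
# common convention, not a guarantee about SIMGPT's interface.
@torch.no_grad()
def generate_greedy(model, ids, max_new_tokens, block_size):
    for _ in range(max_new_tokens):
        # Crop the context to the model's maximum block size.
        ctx = ids[:, -block_size:]
        logits = model(ctx)  # (batch, seq, vocab_size)
        # Take the most likely next token from the last position.
        next_id = logits[:, -1, :].argmax(dim=-1, keepdim=True)
        ids = torch.cat([ids, next_id], dim=1)
    return ids
```

Sampling with a temperature or top-k filter would replace the `argmax` with a draw from the softmax distribution; greedy decoding is just the simplest case.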

## Prompting the Model

This model is a custom PyTorch implementation and does not support the Hugging Face inference widget.

### Interactive Usage (Recommended)

```bash
python inference.py
```