Upload README.md with huggingface_hub
# 🧠 Mini-LLM — 80M Parameter Transformer (Pretrained From Scratch)
**Mini-LLM** is an 80M parameter decoder-only transformer trained **fully from scratch** using a custom tokenizer, custom architecture, and custom training loop.
It is designed as a minimal, research-friendly educational LLM that demonstrates how modern LLM components are built end to end.
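The exact hyperparameters are not listed above, so as a rough sketch, the back-of-the-envelope calculation below uses an assumed configuration (vocabulary size, hidden size, and depth are illustrative, not the real Mini-LLM values) to show how a decoder-only transformer with tied input/output embeddings lands in the 80M-parameter range:

```python
# Hypothetical config -- the real Mini-LLM hyperparameters are not stated here.
VOCAB_SIZE = 32_000   # custom tokenizer vocabulary (assumed)
D_MODEL    = 640      # hidden size (assumed)
N_LAYERS   = 12       # number of decoder blocks (assumed)

# Per decoder block (ignoring biases and layer norms, which are negligible):
#   attention: Q, K, V, O projections -> 4 * d^2
#   MLP with 4x expansion (up + down) -> 8 * d^2
params_per_block = 12 * D_MODEL ** 2

# Token embedding, tied with the output head, so it is counted once.
embedding_params = VOCAB_SIZE * D_MODEL

total = N_LAYERS * params_per_block + embedding_params
print(f"~{total / 1e6:.1f}M parameters")  # ~79.5M, i.e. the 80M ballpark
```

Different choices (untied embeddings, a wider MLP, more layers) shift the total, but the same arithmetic applies to any decoder-only stack.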