---
title: Opus Research
emoji: 🧠
colorFrom: purple
colorTo: blue
sdk: static
pinned: false
---

<div align="center">
  <h1>🧠 Opus Research</h1>
  <p><strong>Training AI models from scratch, one parameter at a time.</strong></p>
  <p><em>"We stand at the right place at the right time."</em></p>
</div>

## 👋 About Us

We're two teenage AI enthusiasts (ages 15 & 17) passionate about understanding AI from the ground up. Instead of just using pre-trained models, we build them ourselves.

## 🚀 Our Models

| Model | Parameters | Architecture | Status |
|-------|-----------|--------------|--------|
| [Opus 1.5](https://huggingface.co/opus-research/opus-1.5) | 0.88B | LLaMA-style | ✅ Released |
| Opus 2.0 | 3B+ | LLaMA + Reasoning | 🔜 Coming Soon |

## 🔬 Research Focus

- **Training from scratch** - No pre-trained weights, 100% original
- **Chain-of-thought reasoning** - Teaching models to think before answering
- **Efficient architectures** - Sub-3B models that run on consumer GPUs
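
As a rough sanity check on the consumer-GPU claim: fp16 weights cost 2 bytes per parameter, so weight memory alone (before activations and KV cache) is easy to estimate. This is a back-of-envelope sketch, not a measured figure:

```python
# Back-of-envelope weight memory for fp16 inference.
# Activations and the KV cache add overhead on top of this.
def weight_gib(n_params: float, bytes_per_param: int = 2) -> float:
    """Weight memory in GiB at the given precision (2 bytes = fp16)."""
    return n_params * bytes_per_param / 2**30

print(f"0.88B @ fp16: {weight_gib(0.88e9):.2f} GiB")  # well under 24 GB cards
print(f"3B    @ fp16: {weight_gib(3e9):.2f} GiB")
```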

## 📊 Opus 1.5 Highlights

- **0.88 billion parameters**
- **~2 GB VRAM** for inference
- **42 hours** training on 2x RTX 4090
- **LLaMA architecture** with RoPE, SwiGLU, GQA, FlashAttention-2
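
For a sense of where a ~0.88B count comes from in a LLaMA-style decoder with GQA and SwiGLU, here is a parameter-count sketch. Every config value below is a hypothetical placeholder chosen to land near 0.88B, not the actual Opus 1.5 config:

```python
# Hypothetical config -- NOT the real Opus 1.5 hyperparameters.
vocab, d, n_layers = 32_000, 2_048, 16
n_heads, n_kv_heads = 16, 4          # grouped-query attention
d_ffn = 5_888                        # SwiGLU intermediate size
head_dim = d // n_heads

embed = vocab * d                        # token embeddings
attn = 2 * d * d                         # W_q and W_o projections
attn += 2 * d * (n_kv_heads * head_dim)  # W_k and W_v, shrunk by GQA
mlp = 3 * d * d_ffn                      # gate, up, and down projections
norms = 2 * d                            # two RMSNorms per block
block = attn + mlp + norms

# Final norm plus an untied lm_head on top of the stacked blocks.
total = embed + n_layers * block + d + vocab * d
print(f"{total / 1e9:.2f}B parameters")
```

Note that RoPE adds no parameters (it rotates Q/K activations), which is why it does not appear in the count.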

## 🔗 Links

- 📦 [Opus 1.5 on HuggingFace](https://huggingface.co/opus-research/opus-1.5)

## ⚡ Powered By

- โค๏ธ Two teenagers who should probably be doing homework
- ๐ŸŽฎ Two RTX 4090s that haven't known rest since December 2025
- ๐Ÿ”ฅ Questionable GPU thermals
- ๐Ÿ“Š A concerning amount of time staring at loss curves
- ๐Ÿ’ก 48GB of VRAM
- โšก ~847 kWh of electricity we'll never get back
- ๐Ÿ’ธ Our parents' electricity bill
- ๐Ÿค” The firm belief that we could've just used ChatGPT

<div align="center">
  <p><em>But where's the fun in that?</em></p>
</div>