spoodddddd committed
Commit 180b996 · verified · 1 Parent(s): ce0dfd8

Update README.md

Files changed (1)
  1. README.md +32 -6
README.md CHANGED
@@ -1,10 +1,36 @@
1
  ---
2
- title: README
3
- emoji: 🐒
4
- colorFrom: blue
5
- colorTo: purple
6
  sdk: static
7
  pinned: false
8
  ---
9
-
10
- Edit this `README.md` markdown file to author your organization card.
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
  ---
2
+ title: Opus Research
3
+ emoji: 🧠
4
+ colorFrom: purple
5
+ colorTo: blue
6
  sdk: static
7
  pinned: false
8
  ---
9
+ <div align="center">
10
+ <h1>🧠 Opus Research</h1>
11
+ <p><strong>Training AI models from scratch, one parameter at a time.</strong></p>
12
+ <p><em>"We stand at the right place at the right time."</em></p>
13
+ </div>
14
+ ---
15
+ ## πŸ‘‹ About Us
16
+ We're two teenage AI enthusiasts (ages 15 & 17) passionate about understanding AI from the ground up. Instead of just using pre-trained models, we build them ourselves.
17
+ ## πŸš€ Our Models
18
+ | Model | Parameters | Architecture | Status |
19
+ |-------|-----------|--------------|--------|
20
+ | [Opus 1.5](https://huggingface.co/opus-research/opus-1.5) | 0.88B | LLaMA-style | βœ… Released |
21
+ | Opus 2.0 | 3B+ | LLaMA + Reasoning | πŸ”œ Coming Soon |
22
+ ## πŸ”¬ Research Focus
23
+ - **Training from scratch** - No pre-trained weights, 100% original
24
+ - **Chain-of-thought reasoning** - Teaching models to think before answering
25
+ - **Efficient architectures** - Sub-3B models that run on consumer GPUs
26
+ ## πŸ“Š Opus 1.5 Highlights
27
+ - **0.88 billion parameters**
28
+ - **~2 GB VRAM** for inference
29
+ - **42 hours** training on 2x RTX 4090
30
+ - **LLaMA architecture** with RoPE, SwiGLU, GQA, FlashAttention-2
31
+ ## πŸ”— Links
32
+ - πŸ“¦ [Opus 1.5 on HuggingFace](https://huggingface.co/opus-research/opus-1.5)
33
+ ---
34
+ <div align="center">
35
+ <p>Made with ❀️ and way too much GPU power</p>
36
+ </div>