---
license: apache-2.0
---
<div align="center">
<img src="https://i.ibb.co/g9Z2CGQ/arcee-lite.webp" alt="Arcee-Lite" style="border-radius: 10px; box-shadow: 0 4px 8px 0 rgba(0, 0, 0, 0.2), 0 6px 20px 0 rgba(0, 0, 0, 0.19); max-width: 100%; height: auto;">
</div>

Arcee-Lite is a compact yet powerful 1.5B parameter language model developed as part of the DistillKit open-source project. Despite its small size, Arcee-Lite demonstrates impressive performance, particularly on the MMLU (Massive Multitask Language Understanding) benchmark.

## Key Features

- **Model Size**: 1.5 billion parameters
- **MMLU Score**: 55.93
- **Distillation Source**: Phi-3-Medium
- **Enhanced Performance**: Merged with high-performing distillations

## About DistillKit

DistillKit is our new open-source project focused on creating efficient, smaller models that maintain high performance. Arcee-Lite is one of the first models to emerge from this initiative.
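
As background, the sketch below shows the temperature-scaled KL-divergence objective commonly used for logit-based knowledge distillation. This is a generic illustration only; the function name `distillation_loss` and all settings are hypothetical, and it is not DistillKit's actual implementation.

```python
# Generic knowledge-distillation loss sketch (illustrative; not DistillKit's code).
# The student is trained to match the teacher's temperature-softened distribution.
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Soften both distributions with a temperature, then take KL(teacher || student).
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * temperature**2
```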

## Performance

Arcee-Lite showcases remarkable capabilities for its size:

- Achieves a 55.93 score on the MMLU benchmark
- Demonstrates strong performance across a variety of tasks for a model of its size

## Use Cases

Arcee-Lite is suitable for a wide range of applications where a balance between model size and performance is crucial (see the usage sketch after this list):

- Embedded systems
- Mobile applications
- Edge computing
- Resource-constrained environments
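
For local experimentation, the minimal sketch below loads the model with Hugging Face `transformers` and runs a single generation. The repository id `arcee-ai/arcee-lite` and the generation settings are illustrative assumptions, not values stated on this card.

```python
# Minimal inference sketch (assumed repo id and settings; adjust as needed).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/arcee-lite"  # assumption: replace with the actual repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain knowledge distillation in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```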

<div align="center">
<img src="https://i.ibb.co/hDC7WBt/Screenshot-2024-08-01-at-8-59-33-AM.png" alt="Arcee-Lite" style="border-radius: 10px; box-shadow: 0 4px 8px 0 rgba(0, 0, 0, 0.2), 0 6px 20px 0 rgba(0, 0, 0, 0.19); max-width: 100%; height: auto;">
</div>

Please note that our internal evaluations were consistently higher than their counterparts on the OpenLLM Leaderboard; these scores should only be used to compare relative performance between models, not weighed against the leaderboard itself.

---