---
library_name: transformers
tags: []
---
A Llama 3 architecture model trained on 2.9 billion tokens from the styal/LucioleData-2M dataset.
| Metric | luciole-115M |
|---|---|
| HellaSwag | 27.3 |
| ARC-Easy | 52.8 |
| PIQA | 58.6 |
| MMLU | 23.1 |
| CommonsenseQA | 19.7 |
| Winogrande | 51.4 |
| OpenBookQA | 18.0 |
| GSM8K (5-shot) | 0.0 |
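Since the card declares `library_name: transformers`, the model can presumably be loaded with the standard `AutoModelForCausalLM` / `AutoTokenizer` API. A minimal usage sketch follows; note that the repo id `styal/luciole-115M` is an assumption inferred from the model name and the dataset namespace above, and may differ from the actual Hub id.

```python
# Minimal text-generation sketch for this model.
# NOTE: the repo id below is an assumption inferred from the model
# name (luciole-115M) and the dataset namespace (styal/); adjust it
# to the actual Hugging Face Hub id if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "styal/luciole-115M"  # assumed repo id


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the model and tokenizer, then greedily generate a completion."""
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Once upon a time"))
```

At 115M parameters the model is small enough to run comfortably on CPU, so no device placement or dtype arguments are shown.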