Update README.md
README.md CHANGED
@@ -26,6 +26,9 @@ We present Amber, the first model in the LLM360 family. Amber is an
 
 Amber is not a SOTA model. Amber is released to make LLM training knowledge accessible to all.
 
+Please refer to our [W&B project page](https://wandb.ai/llm360/Amber?nw=lnzi8o2g4z) for complete training logs and evaluation results.
+
+
 ## Final 10 Checkpoints
 | Checkpoints | |
 | ----------- | ----------- |
@@ -70,25 +73,6 @@ print(tokenizer.decode(outputs[0]))
 | Total | 1259.13 |
 
 
-| Training Loss |
-|------------------------------------------------------------|
-| <img src="loss_curve.png" alt="loss curve" width="400"/> |
-
-
-# 🟠 Evaluation
-
-Please refer to our [W&B project page](https://wandb.ai/llm360/CrystalCoder) for complete training logs and evaluation results.
-
-| ARC | HellaSwag |
-|--------------------------------------------------------|--------------------------------------------------------------------|
-| <img src="amber-arc-curve.png" alt="arc" width="400"/> | <img src="amber-hellaswag-curve.png" alt="hellaswag" width="400"/> |
-
-|MMLU | TruthfulQA |
-|-----------------------------------------------------|-----------------------------------------------------------|
-|<img src="amber-mmlu-curve.png" alt="mmlu" width="400"/> | <img src="amber-truthfulqa-curve.png" alt="truthfulqa" width="400"/> |
-
-Get access now at [LLM360 site](https://www.llm360.ai/)
-
 ## 🟠 Model Description
 
 - **Model type:** Language model with the same architecture as LLaMA-7B