Update README.md
README.md
CHANGED
@@ -26,6 +26,7 @@ base_model: meta-llama/LLaMA-2-7B
### Overview
This model is a distilled version of LLaMA 2 with approximately 80 million parameters. It was trained on a mix of the OpenWebText and WikiText Raw V1 datasets. Knowledge distillation was used to transfer knowledge from a larger "teacher" model, Meta's 7B LLaMA 2, so that this smaller student model learns to mimic the teacher's behavior.

This is the latest version of DistilLlama, trained for five days on two NVIDIA A100 80 GB GPUs.
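For illustration, a minimal sketch of the teacher/student setup using the Hugging Face `transformers` API; the model identifiers and the student checkpoint path are assumptions, not necessarily the exact ones used for this release:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

# Teacher: the full 7B LLaMA 2 model, kept frozen during distillation.
teacher = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf", torch_dtype=torch.float16
)
teacher.eval()

# Student: the ~80M-parameter distilled model that learns to mimic the teacher.
student = AutoModelForCausalLM.from_pretrained("path/to/DistilLlama")  # placeholder path
```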
### Model Architecture
@@ -39,62 +40,6 @@ The architecture is based on LLaMA 2, with the following parameters:
| Transformer Layers | 16 |
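Only the layer count appears in this excerpt; as a rough sketch, a LLaMA-style student of this size could be defined with `transformers` roughly as below. All values other than the 16 layers are placeholders, not the model's actual hyperparameters:

```python
from transformers import LlamaConfig, LlamaForCausalLM

# Placeholder hyperparameters for an ~80M-parameter LLaMA-style student.
# Only num_hidden_layers=16 comes from the table above; the rest are assumptions.
config = LlamaConfig(
    vocab_size=32000,
    hidden_size=512,
    intermediate_size=1376,
    num_hidden_layers=16,
    num_attention_heads=8,
    max_position_embeddings=128,
)
student = LlamaForCausalLM(config)
print(f"{sum(p.numel() for p in student.parameters()) / 1e6:.1f}M parameters")
```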
### Training Process
During each training step, the input data \( X \) is fed to both the teacher and student models. The student computes output logits and a task loss against the true labels, while the teacher only produces logits. The total loss combines the task-specific loss with a distillation loss:
```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence between the temperature-softened student and teacher
    # distributions, scaled by T^2 as is standard in knowledge distillation.
    return F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction='batchmean'
    ) * (temperature ** 2)

# Total loss: weighted combination of distillation loss and task loss.
loss = (alpha * distill_loss) + ((1 - alpha) * task_loss)
```
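A sketch of how a single training step could put these pieces together; the `teacher`, `student`, `batch`, and `optimizer` objects are illustrative, not the repository's exact code:

```python
import torch

temperature, alpha = 2.0, 0.3  # values listed in the training configuration below

def training_step(student, teacher, batch, optimizer):
    input_ids, labels = batch["input_ids"], batch["labels"]

    # The teacher only provides logits, so it runs without gradients.
    with torch.no_grad():
        teacher_logits = teacher(input_ids).logits

    # The student produces logits and the task (cross-entropy) loss.
    student_out = student(input_ids, labels=labels)
    task_loss = student_out.loss
    distill_loss = distillation_loss(student_out.logits, teacher_logits, temperature)

    loss = (alpha * distill_loss) + ((1 - alpha) * task_loss)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()
```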
### Training Configuration
- **Batch Size**: 64
- **Max Sequence Length**: 128
- **Epochs**: 2
- **Log Interval**: 3000
- **Learning Rate**: 3e-4
- **Warmup Steps**: 4000
- **Accumulation Steps**: 8
- **Load Model**: True
- **Temperature**: 2.0
- **Alpha**: 0.3
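A sketch of how this configuration might translate into an optimizer and learning-rate schedule with gradient accumulation; the scheduler choice and the `student`, `train_loader`, and `compute_total_loss` names are assumptions:

```python
import torch
from transformers import get_linear_schedule_with_warmup

learning_rate, warmup_steps, accumulation_steps = 3e-4, 4000, 8
total_steps = 100_000  # placeholder: depends on dataset size, epochs, and batch size

optimizer = torch.optim.AdamW(student.parameters(), lr=learning_rate)
scheduler = get_linear_schedule_with_warmup(optimizer, warmup_steps, total_steps)

for step, batch in enumerate(train_loader):
    loss = compute_total_loss(batch)        # task + distillation loss as above
    (loss / accumulation_steps).backward()  # scale for gradient accumulation
    if (step + 1) % accumulation_steps == 0:
        optimizer.step()
        scheduler.step()
        optimizer.zero_grad()
```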
### Performance Comparison
The model’s performance is evaluated on 200 queries created in-house. For more details, visit the [GitHub repository](https://github.com/svarnim1805/Knowledge-Distillation/tree/main).
#### Example Queries
| **Query** | **Keyword** |
|----------------------------------------------|-------------------|
| The capital of France is | Paris |
| The author of *To Kill a Mockingbird* is | Harper Lee |
| The planet known as the Red Planet is | Mars |
| The smallest country in the world is | Vatican City |
| The *Mona Lisa* was painted by | Leonardo da Vinci |
| The tallest mountain in the world is | Mount Everest |
| The host country of the 2016 Summer Olympics | Brazil |
| The currency of Japan is | Yen |
| The "Father of Computers" is | Charles Babbage |
| The largest ocean on Earth is | Pacific Ocean |
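The keyword column suggests a simple containment check per query; a minimal sketch of that scoring, where `generate_text` is a placeholder for the model's generation call:

```python
# Hypothetical keyword-match scoring over (query, keyword) pairs from the test set.
queries = [
    ("The capital of France is", "Paris"),
    ("The currency of Japan is", "Yen"),
]

def keyword_match(response: str, keyword: str) -> int:
    # 1 if the expected keyword appears anywhere in the generated response, else 0.
    return int(keyword.lower() in response.lower())

scores = [keyword_match(generate_text(query), keyword) for query, keyword in queries]
print(f"Keyword match rate: {sum(scores) / len(scores):.2f}")
```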
### Text Generation Examples
1. **Input**: *The capital of France is*
   - **Output**: "The capital of France is located in the southern province of Lyon, France. The capital is the main hub of the French capital, La Caillion, and the main hub of the French capital. A major part of the French capital is in the southern city of Paris."

2. **Input**: *The tallest mountain in the world is*
   - **Output**: "The tallest mountain in the world is Mount Everest, which is located in the northwest corner of the village of Kahuna. The mountain is about 1,000 feet (3,000 m) above sea level."
### Evaluation Metrics
1. **Cosine Similarity using Word Embeddings**
@@ -121,14 +66,20 @@ The model’s performance is evaluated on 200 queries created in-house. For more
| LLaMA-2-7B-HF | 18215.61 | 1.84e-01 | 0.715 | 0.7257 | 0.0821 |
| baby-llama-58m | 57.20 | 2.73e-06 | 0.025 | 0.6556 | 0.0097 |
| DistilLlama | 77.12 | 7.79e-04 | 0.02 | 0.6623 | 0.0115 |
| DistilLlamaV1 | 78.46 | 8.49e-04 | 0.065 | 0.6776 | 0.0135 |

*Note: CodeCarbon was used to track carbon emissions. The evaluation was allocated 80 GB of memory and 32 CPU cores on an Intel(R) Xeon(R) Gold 6448H.*
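For reference, a minimal sketch of wrapping the evaluation in CodeCarbon's tracker; `run_evaluation()` is a placeholder for the actual evaluation loop:

```python
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(project_name="distillama-eval")
tracker.start()
try:
    run_evaluation()  # placeholder for the evaluation loop
finally:
    emissions_kg = tracker.stop()  # estimated emissions in kg CO2-equivalent
    print(f"Estimated emissions: {emissions_kg:.4f} kg CO2eq")
```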
### Example queries from our test set

| Query | Keyword | Response | Exact Match | Cosine Similarity | ROUGE Score |
|--------------------------------------------|------------------|-----------------------------------------------------------------------------------------------|-------------|-------------------|-------------|
| The capital of France is | Paris | The capital of France is the city of Paris, Paris is the capital and most populous city... | 1 | 0.757961 | 0.0625 |
| The currency of Japan is | Yen | The currency of Japan is the Japanese currency called Yen. | 1 | 0.774518 | 0.083333 |
| The largest ocean on Earth is | Pacific Ocean | The largest ocean on Earth is Pacific Ocean. | 0 | 0.721646 | 0.222222 |
| The continent known as the 'Dark Continent' is | Africa | The continent known as the 'Dark Continent' is Africa. | 1 | 0.725292 | 0.057143 |
| The theory of relativity was developed by | Albert Einstein | The theory of relativity was developed by Einstein, a famous physicist who... | 0 | 0.712056 | 0.055556 |
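A sketch of how the per-query cosine-similarity and ROUGE columns could be computed; the `embed` function stands in for whichever word-embedding model was used, and comparing the response against the query (rather than a reference answer) is an assumption:

```python
import numpy as np
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1"], use_stemmer=True)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def score_response(query: str, response: str, embed) -> dict:
    # `embed` is a placeholder: any function mapping text to a fixed-size vector.
    return {
        "cosine_similarity": cosine_similarity(embed(query), embed(response)),
        "rouge_score": scorer.score(query, response)["rouge1"].fmeasure,
    }
```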
### Acknowledgements
- **University of Melbourne**
- **AGL Energy**
- **My teammates**: Svarnim and Mohit
### GitHub Repositories