Update README.md

README.md

```diff
@@ -14,7 +14,7 @@ Can we create multilingual models that maintain performance comparable to their
 Techniques:
 - Pruning
   - SparseGPT | [GitHub](https://github.com/VishnuVardhanSaiLanka/sparsegpt/tree/aya)
-  - ShortGPT | [Perplexity Sensitivities](https://github.com/rsk2327/DistAya/tree/main)
+  - ShortGPT | [KLDBasedPruning & Perplexity Sensitivities](https://github.com/rsk2327/DistAya/tree/main)
 - Knowledge Distillation
   - DistillKit | [GitHub](https://github.com/ShayekhBinIslam/DistillKit)
   - Distil-Whisper based method
```
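The pruning entries above (SparseGPT, ShortGPT) share one goal: removing the weights or layers whose absence least degrades model quality. As a deliberately simplified stand-in — plain magnitude pruning, not SparseGPT's Hessian-based weight reconstruction or ShortGPT's layer-importance scoring — the core idea can be sketched as:

```python
def magnitude_prune(weights, sparsity):
    """Zero out the `sparsity` fraction of weights with the smallest magnitude.

    A toy, list-based illustration of unstructured pruning; real methods
    (e.g. SparseGPT) operate on tensors and correct the remaining weights.
    """
    k = int(len(weights) * sparsity)
    # Indices of the k smallest-magnitude weights.
    smallest = sorted(range(len(weights)), key=lambda i: abs(weights[i]))[:k]
    pruned = list(weights)
    for i in smallest:
        pruned[i] = 0.0
    return pruned
```

For example, `magnitude_prune([0.1, -2.0, 0.05, 3.0], 0.5)` zeroes the two smallest-magnitude entries, returning `[0.0, -2.0, 0.0, 3.0]`.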
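The Knowledge Distillation entries above (DistillKit, the Distil-Whisper-based method) rest on training a small student model to match a larger teacher's output distribution. A minimal sketch of the classic temperature-softened KL objective — a generic illustration of the technique, not DistillKit's actual API:

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits to a probability distribution, softened by temperature."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradients keep a comparable magnitude across temperatures."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl
```

The loss is zero when the student's logits already match the teacher's and grows as the two distributions diverge; in practice it is combined with the ordinary task loss on ground-truth labels.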