ykae committed (verified)
Commit 01ab50d · Parent: f2aedc8

Update README.md

Files changed (1)
  1. README.md +0 -1
README.md CHANGED
@@ -56,7 +56,6 @@ Standard compression and distillation often requires massive retraining. We prov
 
  * **Training Time:** A few hours on **1x NVIDIA H100**.
  * **Data:** Only **MNLI** + **500k Wikipedia Samples**.
- * **Math over Brute Force:** By replacing all FFNs with **Monarch Matrices** $O(N \log N)$, we reduced the mathematical complexity (GFLOPs) by **66%**.
  * **Trade-off:** This extreme compression comes with a moderate accuracy drop (~5%). *Need higher accuracy? Check out our [Hybrid Version](https://huggingface.co/ykae/monarch-bert-base-mnli-hybrid) (<1% loss).*
 
 ## 🚀 Key Benchmarks
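For context on the bullet removed above: the model card's claim is that each dense FFN projection is replaced by a Monarch-factorized matrix, i.e. two block-diagonal factors with a fixed permutation between them instead of a single dense weight. Below is a minimal PyTorch sketch of such a layer; `MonarchLinear`, its block layout, and the restriction to square (dim → dim) layers are illustrative assumptions of this sketch, not the repository's actual implementation.

```python
import math
import torch
import torch.nn as nn

class MonarchLinear(nn.Module):
    """Illustrative sketch of a Monarch-style factorized square linear layer.

    A dense (dim x dim) weight is approximated by two block-diagonal factors
    with a fixed permutation between them, cutting parameters and
    multiply-adds per token from dim**2 to dim * (bsz + nblocks).
    (Hypothetical class, not the repo's implementation.)
    """

    def __init__(self, dim: int, nblocks: int):
        super().__init__()
        assert dim % nblocks == 0, "nblocks must divide dim"
        self.nblocks = nblocks
        self.bsz = dim // nblocks
        # First factor: nblocks blocks, each of shape (bsz, bsz).
        self.blocks1 = nn.Parameter(
            torch.randn(nblocks, self.bsz, self.bsz) / math.sqrt(self.bsz))
        # Second factor: bsz blocks, each of shape (nblocks, nblocks).
        self.blocks2 = nn.Parameter(
            torch.randn(self.bsz, nblocks, nblocks) / math.sqrt(nblocks))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        shape = x.shape                                    # (..., dim)
        x = x.reshape(-1, self.nblocks, self.bsz)          # split into blocks
        x = torch.einsum("nij,ijk->nik", x, self.blocks1)  # block-diagonal matmul
        x = x.transpose(1, 2)                              # fixed permutation
        x = torch.einsum("nij,ijk->nik", x, self.blocks2)  # second block-diagonal matmul
        x = x.transpose(1, 2)                              # undo permutation
        return x.reshape(shape)
```

Choosing `nblocks ≈ √dim` balances the two factors, which minimizes the per-token cost `dim * (bsz + nblocks)` relative to the dense `dim²`; swapping such layers in for the dense FFN projections is, roughly, the substitution the removed bullet describes.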
 