33% pruning on RedPajama 3B linear layers

The pruned layers are:

1. attention linear layers (query, key, value computation)
2. attention dense layer
3. MLP layers

Pruning is applied in every decoder module, using unstructured magnitude pruning.
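The procedure above can be sketched with PyTorch's built-in pruning utilities. This is a minimal illustration, not the exact script used for this checkpoint: the `DecoderBlock` class and its attribute names (`query_key_value`, `dense`, `mlp_in`, `mlp_out`) are stand-ins for the corresponding RedPajama 3B modules, and only the 33% ratio and the unstructured-magnitude criterion come from the README.

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# Stand-in decoder block with the same kinds of linear layers listed above.
# Module names are illustrative, not the actual checkpoint names.
class DecoderBlock(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.query_key_value = nn.Linear(hidden, 3 * hidden)  # attention q/k/v
        self.dense = nn.Linear(hidden, hidden)                # attention dense
        self.mlp_in = nn.Linear(hidden, 4 * hidden)           # MLP layers
        self.mlp_out = nn.Linear(4 * hidden, hidden)

decoder = nn.ModuleList([DecoderBlock() for _ in range(2)])

# Unstructured magnitude pruning: zero the 33% of weights with the smallest
# absolute value, independently in every linear layer of every block.
for block in decoder:
    for module in block.modules():
        if isinstance(module, nn.Linear):
            prune.l1_unstructured(module, name="weight", amount=0.33)
            prune.remove(module, "weight")  # bake the mask into the weight

# Roughly 33% of each weight tensor is now exactly zero.
w = decoder[0].query_key_value.weight
print(f"sparsity: {(w == 0).float().mean().item():.2f}")
```

`prune.l1_unstructured` attaches a binary mask selecting the lowest-magnitude entries; `prune.remove` then folds the mask into the weight so the saved state dict contains plain sparse-by-zeros tensors rather than mask buffers.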