Update the number of parameters
The number of parameters seems to be wrong for this model; I loaded it and saved it again to check the real number of parameters.
I don't know how that number got here in the first place. If you recall how you saved the checkpoint and think this is an issue with transformers, we'd be happy to take a look and fix it.
Thanks!
model.safetensors.index.json (CHANGED)

```diff
@@ -1,6 +1,6 @@
 {
   "metadata": {
-    "total_parameters":
+    "total_parameters": 32233522176,
     "total_size": 64467044352
   },
   "weight_map": {
```
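As a quick sanity check, the new `total_parameters` value is consistent with the reported `total_size`, assuming the weights are stored in a 2-byte dtype such as bf16 or fp16 (an assumption on my part; the index metadata does not record the dtype):

```python
# Values taken from the index metadata above.
total_size = 64467044352   # bytes
bytes_per_param = 2        # assumption: bf16/fp16 weights (2 bytes each)

total_parameters = total_size // bytes_per_param
print(total_parameters)    # → 32233522176, matching the proposed value
```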