awni, louis-jan committed on
Commit 257dfce · 1 Parent(s): 48d5736

Missing tie_word_embeddings in config.json causes incorrect weight tying (#1)


- Missing tie_word_embeddings in config.json causes incorrect weight tying (82d1e1bb44c017b42bc996e799b9cc7914b2f5f7)


Co-authored-by: Louis <louis-jan@users.noreply.huggingface.co>

Files changed (1):
  1. config.json (+2 -1)
config.json CHANGED
@@ -45,7 +45,8 @@
   },
   "rope_theta": 5000000,
   "use_cache": true,
-  "vocab_size": 151936
+  "vocab_size": 151936,
+  "tie_word_embeddings": false
   },
   "tie_word_embeddings": false,
   "transformers_version": "4.57.0",
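The fix matters because loaders commonly decide whether the output projection (`lm_head`) shares storage with the input embedding table based on the `tie_word_embeddings` flag; if the key is absent from the inner config block and the loader's default is `True`, the two weights get tied when they should be independent. A minimal sketch of that failure mode, using a hypothetical `TinyLM` class (not the actual model code) and a tiny vocab size for brevity:

```python
import json


class TinyLM:
    """Toy model illustrating config-driven weight tying (illustrative only)."""

    def __init__(self, cfg):
        # Token embedding table: vocab_size rows of a small hidden dim.
        self.embed = [[0.0] * 8 for _ in range(cfg["vocab_size"])]
        # Many loaders default tie_word_embeddings to True when the key
        # is missing -- the exact bug this commit's config change avoids.
        if cfg.get("tie_word_embeddings", True):
            self.lm_head = self.embed                      # tied: shared storage
        else:
            self.lm_head = [row[:] for row in self.embed]  # untied: independent copy


# Without the key, the default silently ties the weights:
broken = TinyLM(json.loads('{"vocab_size": 4}'))
print(broken.lm_head is broken.embed)   # tied

# With the explicit "tie_word_embeddings": false from the diff, they stay separate:
fixed = TinyLM(json.loads('{"vocab_size": 4, "tie_word_embeddings": false}'))
print(fixed.lm_head is fixed.embed)     # untied
```

The same reasoning applies at full scale: with `"vocab_size": 151936`, tying or untying changes whether the model has one or two vocab-sized weight matrices, so an incorrect default also changes which checkpoint weights the loader expects to find.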