Instructions to use prajjwal1/bert-mini with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use prajjwal1/bert-mini with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("prajjwal1/bert-mini", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
Add missing "model_type": "bert" to allow loading with Transformers v5.2
#6
by tomaarsen (HF Staff) - opened
- config.json +1 -1
config.json
CHANGED
@@ -1 +1 @@
-{"hidden_size": 256, "hidden_act": "gelu", "initializer_range": 0.02, "vocab_size": 30522, "hidden_dropout_prob": 0.1, "num_attention_heads": 4, "type_vocab_size": 2, "max_position_embeddings": 512, "num_hidden_layers": 4, "intermediate_size": 1024, "attention_probs_dropout_prob": 0.1}
+{"model_type": "bert", "hidden_size": 256, "hidden_act": "gelu", "initializer_range": 0.02, "vocab_size": 30522, "hidden_dropout_prob": 0.1, "num_attention_heads": 4, "type_vocab_size": 2, "max_position_embeddings": 512, "num_hidden_layers": 4, "intermediate_size": 1024, "attention_probs_dropout_prob": 0.1}
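For context on why this one-key change matters: Transformers' Auto classes (`AutoModel`, `AutoConfig`) read the `model_type` field in `config.json` to decide which architecture class to instantiate, and newer releases refuse configs that omit it. The sketch below is an illustrative, stdlib-only check (not part of the PR) that parses an abridged copy of the patched config and confirms the dispatch key is present; the field values mirror the diff above.

```python
import json

# Abridged copy of the patched config.json from this PR; values are taken
# from the "+" line of the diff above.
patched_config = """
{"model_type": "bert", "hidden_size": 256, "hidden_act": "gelu",
 "num_attention_heads": 4, "num_hidden_layers": 4,
 "intermediate_size": 1024, "vocab_size": 30522,
 "max_position_embeddings": 512, "type_vocab_size": 2}
"""

config = json.loads(patched_config)

# "model_type" is the key Auto classes use to pick the architecture
# (here, BERT). Its absence is what broke loading before this PR.
print(config["model_type"])  # -> bert
```

Without `"model_type"`, the original config still described a valid 4-layer, 256-hidden BERT, but the loader had no way to know it should route to the BERT classes.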