Instructions to use microsoft/deberta-v3-small with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use microsoft/deberta-v3-small with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="microsoft/deberta-v3-small")

# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("microsoft/deberta-v3-small", dtype="auto")
```
- Inference Providers
- Notebooks
- Google Colab
- Kaggle
Omar Sanseviero committed
Commit · 6b34656
1 Parent(s): 23bfba9
Add architecture to model repo
Browse files: config.json +3 -0
config.json CHANGED

```diff
@@ -2,6 +2,9 @@
   "model_type": "deberta-v2",
   "attention_probs_dropout_prob": 0.1,
   "hidden_act": "gelu",
+  "architectures": [
+    "DebertaV2ForMaskedLM"
+  ],
   "hidden_dropout_prob": 0.1,
   "hidden_size": 768,
   "initializer_range": 0.02,
```
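The `architectures` field added by this commit is what lets the Auto classes and the fill-mask pipeline know which concrete model class to instantiate from the checkpoint. A minimal sketch of that lookup, using only the standard library and the config fragment from the diff above (the `REGISTRY` dict here is an illustrative stand-in for the real Transformers class registry, not its actual API):

```python
import json

# The config fragment after this commit, with the new "architectures" key.
config = json.loads("""
{
  "model_type": "deberta-v2",
  "architectures": ["DebertaV2ForMaskedLM"],
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768
}
""")

# Illustrative stand-in for the class registry that Transformers
# resolves "architectures" entries against when loading a checkpoint.
REGISTRY = {"DebertaV2ForMaskedLM": "masked language modeling head"}

# Pick the first declared architecture, as the loader would.
arch = config.get("architectures", [None])[0]
print(arch)                # DebertaV2ForMaskedLM
print(REGISTRY.get(arch))  # masked language modeling head
```

Before this commit, the config carried no `architectures` entry, so generic loaders had no declared class to resolve and had to fall back to a default for the `model_type`.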