Instructions for using albert/albert-base-v2 with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use albert/albert-base-v2 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="albert/albert-base-v2")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("albert/albert-base-v2")
model = AutoModelForMaskedLM.from_pretrained("albert/albert-base-v2")
```

- Inference
- Notebooks
- Google Colab
- Kaggle
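As a sketch of how the pipeline above is typically used (assuming `transformers` and a backend such as PyTorch are installed, and that weights are downloaded on first use), the fill-mask pipeline takes a sentence containing the tokenizer's mask token and returns scored candidate fills:

```python
from transformers import pipeline

# Build the fill-mask pipeline for ALBERT
pipe = pipeline("fill-mask", model="albert/albert-base-v2")

# ALBERT's mask token is "[MASK]"; ask for the top 3 candidate fills
predictions = pipe("The capital of France is [MASK].", top_k=3)

for p in predictions:
    # Each prediction is a dict with the filled sequence, the token, and a score
    print(f"{p['token_str']!r}  score={p['score']:.3f}")
```

Each entry in `predictions` also carries the full filled-in `sequence`, which is convenient when the candidates need to be shown in context.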
[FIX] Fix Typo (#3), opened by tony9402
README.md CHANGED

````diff
@@ -113,7 +113,7 @@ and in TensorFlow:
 
 ```python
 from transformers import AlbertTokenizer, TFAlbertModel
-tokenizer = AlbertTokenizer.from_pretrained('albert-base-v2'
+tokenizer = AlbertTokenizer.from_pretrained('albert-base-v2')
 model = TFAlbertModel.from_pretrained("albert-base-v2)
 text = "Replace me by any text you'd like."
 encoded_input = tokenizer(text, return_tensors='tf')
````