Instructions to use albert/albert-base-v2 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use albert/albert-base-v2 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="albert/albert-base-v2")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("albert/albert-base-v2")
model = AutoModelForMaskedLM.from_pretrained("albert/albert-base-v2")
```

- Inference
- Notebooks
- Google Colab
- Kaggle
Updated bug in TensorFlow usage code (README.md)
#5 by pranshupant
README.md
CHANGED

````diff
@@ -114,7 +114,7 @@ and in TensorFlow:
 ```python
 from transformers import AlbertTokenizer, TFAlbertModel
 tokenizer = AlbertTokenizer.from_pretrained('albert-base-v2')
-model = TFAlbertModel.from_pretrained("albert-base-v2)
+model = TFAlbertModel.from_pretrained("albert-base-v2")
 text = "Replace me by any text you'd like."
 encoded_input = tokenizer(text, return_tensors='tf')
 output = model(encoded_input)
````
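The one-character fix above closes an unterminated string literal. As a minimal sketch of why the original line breaks (stdlib only, no `transformers` install needed), the two variants can be checked with Python's built-in `compile`:

```python
# The buggy README line is missing the closing quote in "albert-base-v2).
buggy = 'model = TFAlbertModel.from_pretrained("albert-base-v2)'
fixed = 'model = TFAlbertModel.from_pretrained("albert-base-v2")'

def parses(src: str) -> bool:
    """Return True if src is syntactically valid Python."""
    try:
        compile(src, "<snippet>", "exec")
        return True
    except SyntaxError:
        return False

print(parses(buggy))  # False: unterminated string literal
print(parses(fixed))  # True
```

`compile` only checks syntax here, so the check runs without TensorFlow or the model weights; copy-pasting the pre-fix README line would fail the same way before any import is attempted.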