Instructions for using witiko/mathberta with libraries, inference providers, notebooks, and local apps.
How to use witiko/mathberta with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="witiko/mathberta")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("witiko/mathberta")
model = AutoModelForMaskedLM.from_pretrained("witiko/mathberta")
```
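For a quick check, here is a minimal sketch of the pipeline in action; the masked sentence is illustrative (not from the model card), and `pipe.tokenizer.mask_token` is used so the model's own mask convention is picked up automatically:

```python
# Minimal usage sketch: predict fillers for a masked token.
from transformers import pipeline

pipe = pipeline("fill-mask", model="witiko/mathberta")

# Use the tokenizer's own mask token rather than hard-coding one.
sentence = f"The derivative of x^2 is {pipe.tokenizer.mask_token}."

# Each prediction is a dict with the candidate token and its score.
for prediction in pipe(sentence, top_k=3):
    print(prediction["token_str"], round(prediction["score"], 3))
```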
Add link to paper
README.md CHANGED

```diff
@@ -9,11 +9,11 @@ datasets:
 # MathBERTa model
 
 Pretrained model on English language and LaTeX using a masked language modeling
-(MLM) objective. It was
-
-
+(MLM) objective. It was introduced in [this paper][1] and first released in
+[this repository][2]. This model is case-sensitive: it makes a difference
+between english and English.
 
-[1]:
+[1]: http://ceur-ws.org/Vol-3180/paper-06.pdf
 [2]: https://github.com/witiko/scm-at-arqmath3
 
 ## Model description
```
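The case sensitivity mentioned in the updated README can be checked directly from the tokenizer; a minimal sketch, using only the standard Transformers API and assuming nothing beyond what the README states:

```python
# Minimal sketch: a case-sensitive tokenizer maps "english" and "English"
# to different token id sequences.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("witiko/mathberta")

lower_ids = tokenizer.encode("english", add_special_tokens=False)
upper_ids = tokenizer.encode("English", add_special_tokens=False)

print(lower_ids)
print(upper_ids)
print(lower_ids != upper_ids)  # expected True for a case-sensitive model
```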