Instructions to use delmaksym/aacl22.scale_post with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use delmaksym/aacl22.scale_post with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="delmaksym/aacl22.scale_post")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("delmaksym/aacl22.scale_post")
model = AutoModel.from_pretrained("delmaksym/aacl22.scale_post")
```
- Notebooks
- Google Colab
- Kaggle
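The feature-extraction pipeline above returns one hidden-state vector per token. To get a single fixed-size sentence embedding, a common default (not specified by this model card) is mean pooling over the non-padding tokens using the attention mask. A minimal sketch of that step, shown on synthetic tensors so it runs without downloading the model:

```python
import torch

# Assumption: mean pooling over real (non-padding) tokens; the model card
# does not prescribe a pooling strategy, this is just a common default.
def mean_pool(last_hidden_state: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # last_hidden_state: (batch, seq_len, hidden); attention_mask: (batch, seq_len)
    mask = attention_mask.unsqueeze(-1).float()      # (batch, seq_len, 1)
    summed = (last_hidden_state * mask).sum(dim=1)   # sum over real tokens only
    counts = mask.sum(dim=1).clamp(min=1e-9)         # number of real tokens
    return summed / counts                           # (batch, hidden)

# Synthetic check: three tokens, the last one is padding and is ignored.
hidden = torch.tensor([[[1.0, 2.0], [3.0, 4.0], [100.0, 100.0]]])
mask = torch.tensor([[1, 1, 0]])
print(mean_pool(hidden, mask))  # tensor([[2., 3.]])
```

With the real model, `last_hidden_state` would come from `model(**tokenizer(text, return_tensors="pt")).last_hidden_state` and the mask from the tokenizer output.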
A multilingual language model (XLM-RoBERTa architecture) trained from scratch, from our AACL 2022 paper *Cross-lingual Similarity of Multilingual Representations Revisited*.
Paper (model and training description): https://aclanthology.org/2022.aacl-main.15/
GitHub repo: https://github.com/delmaksym/xsim#cross-lingual-similarity-of-multilingual-representations-revisited