This is a RoBERTa model pre-trained with UD_Coptic for POS-tagging and dependency-parsing, derived from roberta-base-coptic. Every word is tagged by UPOS (Universal Part-Of-Speech).
How to use
# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("token-classification", model="KoichiYasuoka/roberta-base-coptic-upos")
or load the tokenizer and model directly:
from transformers import AutoTokenizer, AutoModelForTokenClassification
tokenizer = AutoTokenizer.from_pretrained("KoichiYasuoka/roberta-base-coptic-upos")
model = AutoModelForTokenClassification.from_pretrained("KoichiYasuoka/roberta-base-coptic-upos")
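With the direct-load approach, the model produces one logit vector per token, and the predicted UPOS tag is the argmax position looked up in the model's `id2label` mapping. The helper below is a minimal, self-contained sketch of that final step; the logits and the abbreviated four-tag label set are invented for illustration (a real run would use `model(**tokenizer(text, return_tensors="pt")).logits` and `model.config.id2label`):

```python
# Sketch only: hypothetical logits and an abbreviated UPOS label set,
# standing in for a live model call.
id2label = {0: "NOUN", 1: "VERB", 2: "DET", 3: "PUNCT"}

def logits_to_upos(logits):
    """Pick the highest-scoring UPOS tag for each token position."""
    tags = []
    for row in logits:
        best = max(range(len(row)), key=lambda i: row[i])
        tags.append(id2label[best])
    return tags

# One hypothetical 4-class logit row per token:
fake_logits = [
    [0.1, 2.3, 0.0, -1.0],  # index 1 scores highest -> VERB
    [3.0, 0.2, 0.1, 0.0],   # index 0 scores highest -> NOUN
]
print(logits_to_upos(fake_logits))  # ['VERB', 'NOUN']
```

Note that RoBERTa tokenizes into subwords, so the pipeline's `aggregation_strategy` option (or manual alignment against word boundaries) is needed to get exactly one tag per word.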
or
import esupar
nlp=esupar.load("KoichiYasuoka/roberta-base-coptic-upos")
esupar: a tokenizer, POS-tagger, and dependency-parser using BERT/RoBERTa/DeBERTa models
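Like other Universal Dependencies tools, esupar represents parses in the standard CoNLL-U format: one token per line with ten tab-separated fields, where field 2 is the word form, field 4 is the UPOS tag, and field 7 is the head index. A minimal sketch of pulling (form, UPOS) pairs out of a CoNLL-U string; the two-token sample sentence (ⲡⲣⲱⲙⲉ, "the man") is hand-written here for illustration, not actual parser output:

```python
# Sketch: extract (FORM, UPOS) pairs from a CoNLL-U string.
# CoNLL-U lines have 10 tab-separated fields:
# ID FORM LEMMA UPOS XPOS FEATS HEAD DEPREL DEPS MISC
sample = "1\tⲡ\tⲡ\tDET\t_\t_\t2\tdet\t_\t_\n" \
         "2\tⲣⲱⲙⲉ\tⲣⲱⲙⲉ\tNOUN\t_\t_\t0\troot\t_\t_"

def upos_pairs(conllu):
    pairs = []
    for line in conllu.splitlines():
        if not line or line.startswith("#"):
            continue  # skip blank lines and sentence-level comments
        fields = line.split("\t")
        pairs.append((fields[1], fields[3]))  # FORM, UPOS
    return pairs

print(upos_pairs(sample))  # [('ⲡ', 'DET'), ('ⲣⲱⲙⲉ', 'NOUN')]
```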
Base model
KoichiYasuoka/roberta-base-coptic