How to use DAMO-NLP-SG/PMR-base with Transformers:
# Load the tokenizer and backbone directly.
# Note: "RoBERTa_PMR" is not a class in the transformers library; the PMR
# extraction head is defined in the authors' own PMR code. AutoModel loads
# the RoBERTa backbone weights published in this checkpoint.
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("DAMO-NLP-SG/PMR-base")
model = AutoModel.from_pretrained("DAMO-NLP-SG/PMR-base")
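PMR (Pre-trained Machine Reader) is an extractive reader: given a query and a context, its head scores token positions and the answer is decoded as the highest-scoring span. As a rough illustration of that decoding step only, here is a minimal sketch with made-up scores; the function name and the score values are assumptions for illustration, not the model's actual API.

```python
# Sketch: picking the best answer span from per-token start/end scores,
# as an extractive reader (like PMR) typically does after its forward pass.
# All numbers below are made-up illustrative values, not real model output.

def best_span(start_scores, end_scores, max_len=10):
    """Return (start, end) indices maximizing start_score + end_score
    subject to start <= end and a maximum span length."""
    best, best_score = (0, 0), float("-inf")
    for i, s in enumerate(start_scores):
        for j in range(i, min(i + max_len, len(end_scores))):
            score = s + end_scores[j]
            if score > best_score:
                best_score, best = score, (i, j)
    return best

start = [0.1, 2.5, 0.3, 0.2]
end = [0.0, 0.4, 3.1, 0.5]
print(best_span(start, end))  # -> (1, 2)
```

In practice the scores would come from the model's extraction head over the tokenized query-plus-context input, and the chosen indices would be mapped back to text via the tokenizer's offsets.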