How to use DAMO-NLP-SG/PMR-large with Transformers:
```python
# Load the tokenizer and model directly from the Hugging Face Hub.
# Note: transformers has no RoBERTa_PMR class; PMR's custom reader class
# lives in the authors' own codebase. PMR-large is RoBERTa-based, so
# AutoModel loads the underlying encoder from this checkpoint.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("DAMO-NLP-SG/PMR-large")
model = AutoModel.from_pretrained("DAMO-NLP-SG/PMR-large")
```
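Once loaded, the model can be used like any RoBERTa encoder. A minimal usage sketch follows; it assumes `AutoModel` resolves to the RoBERTa backbone (PMR's span-extraction head is only available through the authors' own code, so this sketch stops at contextual embeddings). The query and context strings are illustrative only.

```python
# Minimal sketch: encode an MRC-style (query, context) pair with PMR-large.
# Assumption: AutoModel loads the RoBERTa encoder of this checkpoint;
# PMR's extractor head is not part of the transformers library.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("DAMO-NLP-SG/PMR-large")
model = AutoModel.from_pretrained("DAMO-NLP-SG/PMR-large")
model.eval()

# PMR is trained in Machine Reading Comprehension format: a query
# paired with a context passage, encoded as a sentence pair.
query = "Who wrote Hamlet?"
context = "Hamlet is a tragedy written by William Shakespeare."
inputs = tokenizer(query, context, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Contextual token embeddings, shape (batch, seq_len, hidden_size).
embeddings = outputs.last_hidden_state
print(embeddings.shape)
```

Span extraction over these embeddings (scoring candidate answer spans in the context) requires the PMR head from the DAMO-NLP-SG repository; the snippet above only covers the encoder that ships in this checkpoint.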