Probabilistically Masked Language Model Capable of Autoregressive Generation in Arbitrary Word Order
Paper: [arXiv:2004.11579](https://arxiv.org/abs/2004.11579)
```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("jxm/u-PMLM-R")
model = AutoModel.from_pretrained("jxm/u-PMLM-R")
```
PMLM is the language model described in *Probabilistically Masked Language Model Capable of Autoregressive Generation in Arbitrary Word Order*. Unlike a standard masked language model, which masks a fixed fraction of tokens, PMLM is trained with probabilistic masking: the masking ratio itself is drawn from a prior distribution, which lets the model generate text autoregressively in any word order. This is the "PMLM-R" variant, adapted from the authors' original implementation.
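The masking scheme can be sketched roughly as follows. This is a hypothetical illustration of the idea (a masking ratio sampled from a uniform prior, then applied to random positions), not the authors' implementation; `MASK_ID` is a placeholder, since the real mask token id comes from the tokenizer.

```python
import random

MASK_ID = 103  # hypothetical [MASK] id; in practice use tokenizer.mask_token_id


def probabilistic_mask(token_ids, rng=random):
    """Mask a random subset of tokens, with the masking ratio itself
    drawn from a uniform prior on [0, 1] (a sketch of u-PMLM-style masking)."""
    ratio = rng.random()                              # sample masking ratio uniformly
    n_mask = max(1, round(ratio * len(token_ids)))    # mask at least one token
    positions = rng.sample(range(len(token_ids)), n_mask)
    masked = list(token_ids)
    for p in positions:
        masked[p] = MASK_ID
    return masked, sorted(positions)


ids = [7, 12, 99, 4, 56, 23]
masked, pos = probabilistic_mask(ids, rng=random.Random(0))
```

During training, the model is asked to recover the original tokens at the masked positions; because every masking ratio is seen with equal probability, the same model can later fill in one blank at a time in an arbitrary order at generation time.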
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="jxm/u-PMLM-R")
```