# Load the Flax checkpoint from the current directory and re-save it
# as a PyTorch checkpoint (requires both flax and torch installed).
from transformers import XLMRobertaForMaskedLM

model = XLMRobertaForMaskedLM.from_pretrained(".", from_flax=True)
model.save_pretrained(".")