Chinese Ancient BERT Model

Model description

The model uses the BERT-base architecture. We trained it on 4 NVIDIA P100 GPUs for about 7 days (batch size = 24, 1M steps).

How to use

You can use the model directly with a pipeline for masked language modeling (fill-mask):

>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='zhuimengshaonian/bert-ancient-base')
>>> unmasker("ζ΅·ι˜”ε‡­ι±Όθ·ƒοΌŒε€©ι«˜[MASK]ιΈŸι£žγ€‚")
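Each pipeline call returns a ranked list of candidate fills, one dict per candidate with keys "sequence", "score", "token", and "token_str". A minimal sketch of reading that output (the candidates and scores below are invented for illustration, not actual model predictions):

```python
# Hypothetical fill-mask output for the masked line above; in real use
# this list would come from unmasker("ζ΅·ι˜”ε‡­ι±Όθ·ƒοΌŒε€©ι«˜[MASK]ιΈŸι£žγ€‚").
candidates = [
    {"sequence": "ζ΅·ι˜”ε‡­ι±Όθ·ƒοΌŒε€©ι«˜δ»»ιΈŸι£žγ€‚", "score": 0.87, "token": 101, "token_str": "任"},
    {"sequence": "ζ΅·ι˜”ε‡­ι±Όθ·ƒοΌŒε€©ι«˜η™½ιΈŸι£žγ€‚", "score": 0.05, "token": 102, "token_str": "η™½"},
]

# Pick the highest-scoring candidate for the [MASK] position.
best = max(candidates, key=lambda c: c["score"])
print(best["token_str"], best["score"])
```

The pipeline already sorts candidates by score, but selecting with `max` keeps the code robust if the list order ever changes.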