Instructions for using twn39/bert-base-chinese-ONNX with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
How to use twn39/bert-base-chinese-ONNX with Transformers.js:
```js
// npm i @huggingface/transformers
import { pipeline } from '@huggingface/transformers';

// Allocate the fill-mask pipeline
const pipe = await pipeline('fill-mask', 'twn39/bert-base-chinese-ONNX');
```
bert-base-chinese (ONNX)
This is an ONNX version of google-bert/bert-base-chinese. It was converted and uploaded automatically using this Space.
Downloads last month: 1
Model tree for twn39/bert-base-chinese-ONNX
- Base model: google-bert/bert-base-chinese