hubertusz-tiny-wiki-seq128 (ONNX)
This is an ONNX version of SzegedAI/hubertusz-tiny-wiki-seq128. It was automatically converted and uploaded using this Hugging Face Space.
Usage with Transformers.js
See the pipelines documentation: https://huggingface.co/docs/transformers.js/api/pipelines
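A minimal usage sketch with Transformers.js. The `fill-mask` task is an assumption based on the model's MLM pre-training objective, and the example sentence is purely illustrative:

```javascript
import { pipeline } from '@huggingface/transformers';

// Load the ONNX model through the fill-mask pipeline
// (task assumed from the MLM pre-training objective).
const unmasker = await pipeline(
  'fill-mask',
  'onnx-community/hubertusz-tiny-wiki-seq128-ONNX'
);

// Hungarian example sentence with a masked token.
const output = await unmasker('Budapest Magyarország [MASK].');
console.log(output); // candidate completions with scores
```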
hubert-tiny-wiki-seq128
The fully trained model, including the second phase of training, is available here: SzegedAI/hubert-tiny-wiki
This model was trained from scratch on the Wikipedia subset of the Hungarian Webcorpus 2.0 with the MLM and SOP objectives.
Pre-Training Parameters:
- Training steps: 500,000
- Sequence length: 128 (the model supports up to 512)
- Batch size: 1024
Framework versions
- Transformers 4.21.3
- TensorFlow 2.10.0
- Datasets 2.4.0
- Tokenizers 0.12.1