How to use sofom/roberta-base-turingbench-aa with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use sofom/roberta-base-turingbench-aa with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="sofom/roberta-base-turingbench-aa")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("sofom/roberta-base-turingbench-aa")
model = AutoModel.from_pretrained("sofom/roberta-base-turingbench-aa")
```

- Notebooks
- Google Colab
- Kaggle
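The feature-extraction pipeline above returns token-level hidden states, one vector per token, rather than a single sentence embedding. A common post-processing step is mean pooling over the token axis. The sketch below assumes the pipeline's list-of-lists output shape `(1, seq_len, hidden)` for a single input string, with RoBERTa-base's 768-dimensional hidden states; the `mean_pool` helper and the synthetic input are illustrative, not part of the model repo.

```python
import numpy as np

def mean_pool(features):
    """Mean-pool token-level features from a feature-extraction
    pipeline call into one fixed-size vector.

    `features` is the nested-list output of pipe("some text") for a
    single input string, with shape (1, seq_len, hidden)."""
    arr = np.asarray(features)          # (1, seq_len, hidden)
    return arr.mean(axis=1).squeeze(0)  # (hidden,)

# Synthetic stand-in for a pipeline output: 1 sequence, 4 tokens,
# 768-dim hidden states (RoBERTa-base's hidden size).
fake_features = np.zeros((1, 4, 768)).tolist()
vec = mean_pool(fake_features)
print(vec.shape)  # (768,)
```

In practice you would pass `pipe("your text")` instead of the synthetic array; the pooled vector can then feed a downstream classifier or similarity search.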
- Xet hash: `1dc969aa45e024704fee1f9bc33780ccdc390a25b8ee05c3953e02cfb4b6cc6c`
- Size of remote file: 249 MB
- SHA256: `802e53871526e791a27308bf6414d2c50a10ac87c6fe9fc747ff17751c8ac017`
Xet efficiently stores large files inside Git by intelligently splitting them into unique chunks, which accelerates uploads and downloads.