Instructions for using TurkuNLP/eccobert-base-cased-v1 with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use TurkuNLP/eccobert-base-cased-v1 with Transformers (an inference sketch follows the notebook links below):

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForPreTraining

tokenizer = AutoTokenizer.from_pretrained("TurkuNLP/eccobert-base-cased-v1")
model = AutoModelForPreTraining.from_pretrained("TurkuNLP/eccobert-base-cased-v1")
```

- Notebooks
- Google Colab
- Kaggle
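Once the checkpoint is loaded, the pretrained masked-language-modelling head can be exercised directly. The sketch below uses the Transformers fill-mask pipeline as one way to do this; the example sentence is made up for illustration and is not taken from the model card.

```python
# Minimal sketch: masked-token prediction with the pretrained MLM head.
# The input sentence is an illustrative placeholder; predictions reflect
# the 18th-century ECCO vocabulary the model was trained on.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="TurkuNLP/eccobert-base-cased-v1")

for prediction in fill_mask("The [MASK] of London is a very great city."):
    print(prediction["token_str"], round(prediction["score"], 3))
```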
ECCO-BERT base model (cased)
A BERT model pretrained exclusively on the ECCO (Eighteenth Century Collections Online) dataset of digitized documents published in the United Kingdom during the 18th century. The model is equivalent in size to bert-base-cased and is intended for fine-tuning on various tasks that use the ECCO dataset.
Documentation in progress...
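Pending fuller documentation, here is a minimal fine-tuning sketch. The classification task, the toy texts and labels, and the hyperparameters are placeholders chosen for illustration; they are not part of the model card.

```python
# Minimal fine-tuning sketch: binary text classification (placeholder task).
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          DataCollatorWithPadding, Trainer, TrainingArguments)

model_name = "TurkuNLP/eccobert-base-cased-v1"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# A classification head is newly initialized on top of the pretrained encoder.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Toy placeholder data; replace with real ECCO-derived texts and labels.
data = Dataset.from_dict({
    "text": ["An essay upon the nature of trade.", "A sermon preached at St. Paul's."],
    "label": [0, 1],
})
tokenized = data.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="eccobert-finetuned", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=tokenized,
    data_collator=DataCollatorWithPadding(tokenizer),
)
trainer.train()
```

For token-level tasks, AutoModelForTokenClassification can be substituted in the same way.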