Instructions for using yiyanghkust/finbert-pretrain with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use yiyanghkust/finbert-pretrain with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="yiyanghkust/finbert-pretrain")
```

```python
# Load model directly
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained("yiyanghkust/finbert-pretrain", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
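Under the hood, the `fill-mask` pipeline scores every vocabulary token at the masked position with a softmax over the model's output logits and returns the highest-scoring candidates. A minimal sketch of that ranking step, with random logits and a tiny hypothetical vocabulary standing in for FinBERT's real output and wordpiece vocabulary (the names and scores here are illustrative, not the model's):

```python
import torch

# Hypothetical tiny vocabulary standing in for FinBERT's wordpiece vocabulary
vocab = ["revenue", "profit", "loss", "growth", "debt"]

# Random logits stand in for the model's output at the [MASK] position
torch.manual_seed(0)
logits = torch.randn(len(vocab))

# The pipeline applies a softmax over the vocabulary, then takes the top-k tokens
probs = torch.softmax(logits, dim=-1)
topk = torch.topk(probs, k=3)

for score, idx in zip(topk.values, topk.indices):
    print(f"{vocab[idx]}: {score:.3f}")
```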
Add TF weights
#5 by joaogante - opened
Model converted with the `transformers` `pt_to_tf` CLI.
All converted model outputs and hidden layers were validated against their PyTorch counterparts. Maximum crossload output difference = 1.144e-05; maximum converted output difference = 1.144e-05.
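The reported numbers are element-wise maximum absolute differences between the TensorFlow and PyTorch model outputs; a conversion is only accepted when that difference stays below a small tolerance. A sketch of that check with NumPy (the arrays and the 5e-5 tolerance are illustrative assumptions, not the CLI's internals):

```python
import numpy as np

# Stand-ins for the PyTorch and TensorFlow model outputs (illustrative data)
rng = np.random.default_rng(0)
pt_output = rng.standard_normal((2, 8, 32)).astype(np.float32)
tf_output = pt_output + rng.uniform(-1e-5, 1e-5, pt_output.shape).astype(np.float32)

# Element-wise maximum absolute difference, as quoted in the conversion report
max_diff = float(np.max(np.abs(pt_output - tf_output)))
print(f"Maximum output difference = {max_diff:.3e}")

# Assumed tolerance; a real conversion check would fail above this threshold
TOLERANCE = 5e-5
assert max_diff < TOLERANCE
```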