Instructions for using perplexity-correlations/fasttext-lambada-it-target with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- fastText
How to use perplexity-correlations/fasttext-lambada-it-target with fastText:
```python
from huggingface_hub import hf_hub_download
import fasttext

model = fasttext.load_model(
    hf_hub_download("perplexity-correlations/fasttext-lambada-it-target", "model.bin")
)
```
- Notebooks
- Google Colab
- Kaggle
Improve model card: Add metadata, description and Github link
#1 opened by nielsr (HF Staff)
This PR improves the model card by adding missing metadata (`license`, `library_name`, and `pipeline_tag`) and a link to the GitHub repository. The description has been updated to clarify that this is a pretraining data filter, not a language model.
Tristan changed pull request status to merged