
natope/qa_bm25_small_sample

Tags: Transformers, PyTorch, TensorBoard, mt5, text2text-generation
Instructions for using natope/qa_bm25_small_sample with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.

  • Libraries
  • Transformers

    How to use natope/qa_bm25_small_sample with Transformers:

    # Load model directly
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
    
    tokenizer = AutoTokenizer.from_pretrained("natope/qa_bm25_small_sample")
    model = AutoModelForSeq2SeqLM.from_pretrained("natope/qa_bm25_small_sample")
  • Notebooks
  • Google Colab
  • Kaggle
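Since the checkpoint is an mT5 fine-tune tagged for text2text-generation, the loaded model and tokenizer can be used for inference with `model.generate`. A minimal sketch follows; note that the prompt format (how a question and retrieved context are concatenated) is an assumption, since the model card does not document it:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the fine-tuned mT5 checkpoint from the Hub
tokenizer = AutoTokenizer.from_pretrained("natope/qa_bm25_small_sample")
model = AutoModelForSeq2SeqLM.from_pretrained("natope/qa_bm25_small_sample")

# Hypothetical prompt format: the card does not specify how the model
# expects question and context to be combined, so this layout is a guess.
prompt = (
    "question: Where is the Eiffel Tower? "
    "context: The Eiffel Tower is a landmark in Paris, France."
)
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding; adjust max_new_tokens for longer answers
output_ids = model.generate(**inputs, max_new_tokens=32)
answer = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(answer)
```

Because the repository only contains an in-progress checkpoint (step 500), generation quality from this snapshot may be limited.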
qa_bm25_small_sample / runs (47.2 kB)
  • 1 contributor
History: 2 commits
natope
Latest commit: eebb46e, "Training in progress, step 500", almost 3 years ago
  • May30_17-56-11_9b9b296baa2c
    Training in progress, step 500 almost 3 years ago
  • May30_18-00-50_9b9b296baa2c
    Training in progress, step 500 almost 3 years ago
  • May30_18-13-03_9b9b296baa2c
    Training in progress, step 500 almost 3 years ago
  • May30_18-19-57_9b9b296baa2c
    Training in progress, step 500 almost 3 years ago