
Jetmaxx/flan-t5-base-max

Tags: Transformers · TensorBoard · Safetensors · t5 · text2text-generation · Generated from Trainer · Eval Results (legacy) · text-generation-inference

Instructions for using Jetmaxx/flan-t5-base-max with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.

  • Libraries
  • Transformers

    How to use Jetmaxx/flan-t5-base-max with Transformers:

    # Load model directly
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
    
    tokenizer = AutoTokenizer.from_pretrained("Jetmaxx/flan-t5-base-max")
    model = AutoModelForSeq2SeqLM.from_pretrained("Jetmaxx/flan-t5-base-max")
  • Notebooks
  • Google Colab
  • Kaggle
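Once the tokenizer and model are loaded as shown above, inference follows the standard text2text pattern: tokenize a prompt, call `generate`, and decode the result. A minimal sketch, assuming the checkpoint behaves like a standard FLAN-T5 text2text model (the prompt below is only an illustrative example, not from the model card):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Jetmaxx/flan-t5-base-max")
model = AutoModelForSeq2SeqLM.from_pretrained("Jetmaxx/flan-t5-base-max")

# Tokenize an example instruction-style prompt (hypothetical; any
# text2text task the model was trained on would work here)
inputs = tokenizer(
    "Translate English to German: Hello, how are you?",
    return_tensors="pt",
)

# Generate and decode the model's response
output_ids = model.generate(**inputs, max_new_tokens=50)
text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)
```

`max_new_tokens` caps the length of the generated continuation; sampling parameters such as `do_sample` or `num_beams` can be passed to `generate` as well.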
flan-t5-base-max / logs
12.8 kB
  • 1 contributor
History: 1 commit
Jetmaxx
End of training
d58a406 verified over 1 year ago
  • events.out.tfevents.1734259327.ab96a84ac7b2.755.0
    12.8 kB
    End of training over 1 year ago