Improve model card: license, model type, tags, and links

#1 opened by nielsr (HF Staff)

This PR improves the model card for itay1itzhak/T5-Flan by:

  • Correcting the license from apache-2.0 to mit, aligning with the license stated in the accompanying GitHub repository.
  • Updating the model type description from "Causal decoder-based transformer" to "Encoder-Decoder transformer", which accurately reflects the T5 architecture.
  • Removing the causal-lm tag as T5 is an encoder-decoder model, not a causal language model.
  • Adding a link to the project page (https://itay1itzhak.github.io/planted-in-pretraining) for easier access to related resources.
  • Updating the sample usage code to correctly use AutoModelForSeq2SeqLM for loading a T5 (encoder-decoder) model, which aligns with the text2text-generation pipeline tag.
  • Removing the extraneous "File information" section, as it is internal context and not part of the standard model card content.
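The loading fix described above can be sketched as follows. This is a minimal illustration, not the PR's exact snippet: it assumes the standard `transformers` API and uses an arbitrary example prompt.

```python
# Corrected sample usage: T5-Flan is an encoder-decoder model, so it is
# loaded with AutoModelForSeq2SeqLM rather than AutoModelForCausalLM.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "itay1itzhak/T5-Flan"  # model ID from this PR

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Example prompt (illustrative only); any text2text-generation input works.
inputs = tokenizer("Summarize: The quick brown fox jumps over the lazy dog.",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

This matches the `text2text-generation` pipeline tag, which expects a sequence-to-sequence model class.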

These changes enhance the model card's accuracy, completeness, and usability for the Hugging Face community.

itay1itzhak changed pull request status to merged
