Improve model card: license, model type, tags, and links
#1 opened by nielsr
This PR improves the model card for itay1itzhak/T5-Flan by:
- Correcting the license from `apache-2.0` to `mit`, aligning with the license stated in the accompanying GitHub repository.
- Updating the model type description from "Causal decoder-based transformer" to "Encoder-Decoder transformer", which accurately reflects the T5 architecture.
- Removing the `causal-lm` tag, as T5 is an encoder-decoder model, not a causal language model.
- Adding a link to the project page (https://itay1itzhak.github.io/planted-in-pretraining) for easier access to related resources.
- Updating the sample usage code to correctly use `AutoModelForSeq2SeqLM` for loading a T5 (encoder-decoder) model, which aligns with the `text2text-generation` pipeline tag.
- Removing the extraneous "File information" section, as it is internal context and not part of the standard model card content.
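The corrected loading pattern can be sketched as follows; the prompt text and generation settings are illustrative and not taken from the model card:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "itay1itzhak/T5-Flan"  # model repo this PR updates

tokenizer = AutoTokenizer.from_pretrained(model_id)
# AutoModelForSeq2SeqLM matches T5's encoder-decoder architecture;
# a causal-LM auto class would not load the correct model head.
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Example prompt (hypothetical); T5-style models expect a task-prefixed input.
inputs = tokenizer("Translate to German: Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same model can also be used via `pipeline("text2text-generation", model=model_id)`, consistent with the pipeline tag above.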
These changes enhance the model card's accuracy, completeness, and usability for the Hugging Face community.
itay1itzhak changed pull request status to merged