Improve model card: Add text-generation pipeline tag, transformers library, links, and sample usage for RAPO++ Prompt Rewriter
#1 opened by nielsr (HF Staff)
This PR significantly enhances the model card for the RAPO++ Prompt Rewriter model by:
- Adding `pipeline_tag: text-generation` to correctly categorize the model as an LLM for prompt rewriting.
- Specifying `library_name: transformers` to enable the automated "how to use" widget, as the model is fully compatible with the Hugging Face `transformers` library.
- Including links to the official paper (RAPO++: Cross-Stage Prompt Optimization for Text-to-Video Generation via Data Alignment and Test-Time Scaling), the project page (https://whynothaha.github.io/RAPO_plus_github/), and the GitHub repository (https://github.com/Vchitect/RAPO).
- Adding a `transformers`-based Python sample usage snippet demonstrating how to use the model for prompt rewriting, as is standard for instruction-tuned LLMs on the Hub.
- Including an overview image and the BibTeX citation from the GitHub repository.
These updates will improve the model's discoverability, provide essential information, and offer immediate usability examples for the community.
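The sample usage mentioned above could look like the sketch below. The repo id, system prompt, and generation parameters are assumptions for illustration (the actual checkpoint name and recommended prompt template should be taken from the model card / GitHub repository); the chat-message format is the standard one for instruction-tuned LLMs with `transformers`.

```python
from transformers import pipeline

# Hypothetical repo id — replace with the actual RAPO++ rewriter checkpoint on the Hub.
MODEL_ID = "Vchitect/RAPO-prompt-rewriter"

# Assumed system prompt for illustration; use the template recommended by the authors.
SYSTEM_PROMPT = (
    "You are a prompt rewriter for text-to-video generation. "
    "Rewrite the user's prompt with rich, concrete visual detail."
)

def build_messages(user_prompt: str) -> list[dict]:
    """Chat-format messages for an instruction-tuned rewriter."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_prompt},
    ]

def rewrite(user_prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model and return the rewritten prompt (downloads weights on first call)."""
    generator = pipeline("text-generation", model=MODEL_ID, device_map="auto")
    out = generator(build_messages(user_prompt), max_new_tokens=max_new_tokens)
    # For chat-style inputs, the pipeline returns the full message list;
    # the last message is the assistant's rewritten prompt.
    return out[0]["generated_text"][-1]["content"]

# Example (requires downloading the model):
# print(rewrite("a cat playing piano"))
```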