Update model card for LongCat-Flash-Chat: correct library name and add direct links

#1 opened by nielsr (HF Staff)

This PR updates the model card for meituan-longcat/LongCat-Flash-Chat.

It addresses the following:

  • Updates the `library_name` metadata from `LongCat-Flash-Chat` to `transformers`. This is confirmed by the presence of `config.json`, `tokenizer_config.json`, and `generation_config.json` (which explicitly mentions `transformers_version`), indicating full compatibility with the Hugging Face `transformers` library. This change will enable the automated "How to use" widget on the Hugging Face Hub.
  • Adds explicit links to the paper, project page, and GitHub repository at the top of the model card for better visibility.
  • Updates the internal link in the "Model Introduction" section from the GitHub PDF to the official Hugging Face paper page for consistency.
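For reference, the metadata change amounts to editing the YAML front matter at the top of the card's README. A minimal sketch (the `pipeline_tag` line is illustrative only and not part of this PR):

```yaml
---
# Before: library_name: LongCat-Flash-Chat
library_name: transformers     # enables the Hub's automated "How to use" widget
pipeline_tag: text-generation  # illustrative; the card's actual tag may differ
---
```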

The existing content, including the detailed descriptions, evaluation results, and quick start examples, is already well documented and remains unchanged.

