Update model card for LongCat-Flash-Chat: correct library name and add direct links

#1 by nielsr (HF Staff) - opened

Files changed (1):
  1. README.md (+7, -4)

README.md CHANGED
```diff
@@ -1,6 +1,6 @@
 ---
+library_name: transformers
 license: mit
-library_name: LongCat-Flash-Chat
 pipeline_tag: text-generation
 tags:
 - transformers
@@ -8,6 +8,10 @@ tags:
 
 # LongCat-Flash-Chat
 
+Paper: [LongCat-Flash Technical Report](https://huggingface.co/papers/2509.01322)
+Project Page: https://longcat.ai
+GitHub Repository: https://github.com/meituan-longcat/LongCat-Flash-Chat
+
 <div align="center">
 <img src="https://raw.githubusercontent.com/meituan-longcat/LongCat-Flash-Chat/main/figures/longcat_logo.svg"
 width="300"
@@ -62,7 +66,7 @@ Effectively and efficiently scaling model size remains a key challenge in strate
 #### 🌟 Multi-Stage Training Pipeline for Agentic Capability
 Through a meticulously designed pipeline, LongCat-Flash is endowed with advanced agentic behaviors. Initial efforts focus on constructing a more suitable base model for agentic post-training, where we design a two-stage pretraining data fusion strategy to concentrate reasoning-intensive domain data. During mid-training, we enhance reasoning and coding capabilities while extending the context length to 128k to meet agentic post-training requirements. Building on this advanced base model, we proceed with a multi-stage post-training. Recognizing the scarcity of high-quality, high-difficulty training problems for agentic tasks, we design a multi-agent synthesis framework that defines task difficulty across three axes, i.e., information processing, tool-set complexity, and user interaction—using specialized controllers to generate complex tasks requiring iterative reasoning and environmental interaction.
 
-For more detail, please refer to the comprehensive [***LongCat-Flash Technical Report***](https://github.com/meituan-longcat/LongCat-Flash-Chat/blob/main/tech_report.pdf).
+For more detail, please refer to the comprehensive [***LongCat-Flash Technical Report***](https://huggingface.co/papers/2509.01322).
 
 ## Evaluation Results
 | **Benchmark** | **DeepSeek V3.1** | **Qwen3 MoE-2507** | **Kimi-K2** | **GPT-4.1** | **Claude4 Sonnet** | **Gemini2.5 Flash** | **LongCat-Flash** |
@@ -220,5 +224,4 @@ We kindly encourage citation of our work if you find it useful.
 
 
 ## Contact
-Please contact us at <a href="mailto:longcat-team@meituan.com">longcat-team@meituan.com</a> or open an issue if you have any questions.
-
+Please contact us at <a href="mailto:longcat-team@meituan.com">longcat-team@meituan.com</a> or open an issue if you have any questions.
```
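
The substantive fix in this PR is the `library_name` field in the model card's YAML front matter, which the Hub reads to choose the model page's code snippet and integration (the old value, `LongCat-Flash-Chat`, is not a library). As a rough sketch of what that metadata lookup amounts to — using plain string parsing as a simplified stand-in for the Hub's real YAML-based parser, with a hypothetical `read_field` helper and an inline copy of the corrected header:

```python
# The front-matter string mirrors the corrected header from this PR.
front_matter = """---
library_name: transformers
license: mit
pipeline_tag: text-generation
tags:
- transformers
---"""

def read_field(card: str, key: str):
    """Return the value of a top-level `key:` line inside the first --- block.

    Simplified illustration only: real model cards are parsed with a YAML
    parser (e.g. via huggingface_hub's ModelCard), not line-by-line.
    """
    inside = False
    for line in card.splitlines():
        if line.strip() == "---":
            if inside:
                break  # reached the closing delimiter
            inside = True
            continue
        if inside and line.startswith(key + ":"):
            return line.split(":", 1)[1].strip()
    return None

print(read_field(front_matter, "library_name"))  # → transformers
```

With the old header, this lookup would have yielded the non-library value `LongCat-Flash-Chat`, which is why the Hub could not associate the repo with `transformers` tooling.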