Improve model card: Add metadata, paper link, code link, and project page link

#3
Opened by nielsr (HF Staff)
Files changed (1)
  1. README.md +19 -8
README.md CHANGED
@@ -1,3 +1,14 @@
+---
+license: apache-2.0
+pipeline_tag: automatic-speech-recognition
+library_name: funasr
+---
+
+This repository contains the model presented in [Fun-ASR Technical Report](https://huggingface.co/papers/2509.12508).
+
+Code: https://github.com/FunAudioLLM/Fun-ASR
+Project Page: https://funaudiollm.github.io/funasr
+
 # Fun-ASR
 
 「[简体中文](README_zh.md)」|「English」
@@ -158,13 +169,13 @@ We evaluated Fun-ASR against other state-of-the-art models on open-source benchm
 | **Model Size** | 1.5B | 1.5B | 1.6B | - | - | - | - | 1.1B | 0.8B | 7.7B |
 | **OpenSource** | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ✅ | ✅ | ✅ | ❌ |
 | AIShell1 | 1.81 | 2.17 | 4.72 | 0.68 | 1.63 | 0.71 | 0.63 | 0.54 | 1.80 | 1.22 |
-| AIShell2 | - | 3.47 | 4.68 | 2.27 | 2.76 | 2.86 | 2.10 | 2.58 | 2.75 | 2.39 |
-| Fleurs-zh | - | 3.65 | 5.18 | 3.43 | 3.23 | 3.11 | 2.68 | 4.81 | 2.56 | 2.53 |
-| Fleurs-en | 5.78 | 6.95 | 6.23 | 9.39 | 9.39 | 6.99 | 3.03 | 10.79 | 5.96 | 4.74 |
-| Librispeech-clean | 2.00 | 2.17 | 1.86 | 1.58 | 2.8 | 1.32 | 1.17 | 1.84 | 1.76 | 1.51 |
-| Librispeech-other | 4.19 | 4.43 | 3.43 | 2.84 | 5.69 | 2.63 | 2.42 | 4.52 | 4.33 | 3.03 |
-| WenetSpeech Meeting | 6.73 | 8.21 | 18.39 | 5.69 | 7.07 | 6.24 | 4.75 | 4.95 | 6.60 | 6.17 |
-| WenetSpeech Net | - | 6.33 | 11.89 | 4.66 | 4.84 | 6.45 | 4.67 | 4.94 | 6.01 | 5.46 |
+| AIShell2 | - | 3.47 | 4.68 | 2.27 | 2.76 | 2.86 | 2.10 | 2.58 | 2.75 | 2.39 |
+| Fleurs-zh | - | 3.65 | 5.18 | 3.43 | 3.23 | 3.11 | 2.68 | 4.81 | 2.56 | 2.53 |
+| Fleurs-en | 5.78 | 6.95 | 6.23 | 9.39 | 9.39 | 6.99 | 3.03 | 10.79 | 5.96 | 4.74 |
+| Librispeech-clean | 2.00 | 2.17 | 1.86 | 1.58 | 2.8 | 1.32 | 1.17 | 1.84 | 1.76 | 1.51 |
+| Librispeech-other | 4.19 | 4.43 | 3.43 | 2.84 | 5.69 | 2.63 | 2.42 | 4.52 | 4.33 | 3.03 |
+| WenetSpeech Meeting | 6.73 | 8.21 | 18.39 | 5.69 | 7.07 | 6.24 | 4.75 | 4.95 | 6.60 | 6.17 |
+| WenetSpeech Net | - | 6.33 | 11.89 | 4.66 | 4.84 | 6.45 | 4.67 | 4.94 | 6.01 | 5.46 |
 
 > _Note: Seed-ASR\* results are evaluated using the official API on volcengine; GLM-ASR-nano\* results are evaluated using the open-source checkpoint._
 
@@ -198,4 +209,4 @@ We evaluated Fun-ASR against other state-of-the-art models on open-source benchm
 journal={arXiv preprint arXiv:2509.12508},
 year={2025}
 }
-```
+```
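
The YAML front matter added by this PR is what the Hub reads for the license badge, the pipeline tag, and the library integration: a block of `key: value` pairs delimited by a pair of `---` lines at the very top of README.md. As a minimal sketch of that structure (the Hub's real parser uses a full YAML library; `parse_front_matter` here is a hypothetical helper handling flat keys only):

```python
# Sketch: extract the flat key/value pairs from a README's YAML front matter.
# The block is delimited by a pair of "---" lines at the top of the file.
readme = """---
license: apache-2.0
pipeline_tag: automatic-speech-recognition
library_name: funasr
---

# Fun-ASR
"""

def parse_front_matter(text):
    """Return front-matter key/value pairs as a dict (flat keys only)."""
    lines = text.splitlines()
    if not lines or lines[0].strip() != "---":
        return {}  # no front matter present
    meta = {}
    for line in lines[1:]:
        if line.strip() == "---":  # closing delimiter ends the block
            break
        key, _, value = line.partition(":")
        meta[key.strip()] = value.strip()
    return meta

print(parse_front_matter(readme)["pipeline_tag"])  # automatic-speech-recognition
```

Without this block the repository shows no license or task tag and is not associated with the `funasr` library, which is why the PR adds it ahead of the existing `# Fun-ASR` heading.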