vdmbrsv committed · verified · Commit 107ca80 · Parent(s): 070358d

Update README.md

Files changed (1): README.md (+4 −0)
README.md CHANGED
@@ -173,6 +173,10 @@ Faust-1 uses a custom tokenizer optimized for German morphology and compounding.
 
 Lower token counts on German text translate directly into more usable context, lower inference cost, and less fragmentation on compound-heavy inputs.
 
+
+<img src="tokenizer_faust.png" alt="Faust-1 vs OpenAI Tokenizers" width="800">
+
+
 ---
 
 ## German benchmark performance