ComparEdge committed
Commit f44ebf4 · verified · 1 parent: ced2896

Upload llm-benchmarks-2026.csv with huggingface_hub
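The commit message says the file was uploaded with huggingface_hub. As a hedged sketch (not the committer's actual script), an upload like this one is typically done with `HfApi.upload_file`; here `repo_id` and `token` are placeholders, and huggingface_hub must be installed with write access to the target dataset repo:

```python
def upload_benchmarks(repo_id, token=None):
    """Sketch of uploading the CSV to a Hugging Face dataset repo.

    repo_id is a placeholder (e.g. "your-org/your-dataset"); this is an
    assumed reconstruction, not the committer's actual code.
    """
    # Third-party dependency: pip install huggingface_hub
    from huggingface_hub import HfApi

    api = HfApi(token=token)
    api.upload_file(
        path_or_fileobj="llm-benchmarks-2026.csv",  # local file to upload
        path_in_repo="llm-benchmarks-2026.csv",     # destination path in the repo
        repo_id=repo_id,
        repo_type="dataset",
        commit_message="Upload llm-benchmarks-2026.csv with huggingface_hub",
    )
```

Calling `upload_benchmarks("your-org/your-dataset")` would create a commit like this one; `repo_type="dataset"` is required since the default is a model repo.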

Files changed (1):
  1. llm-benchmarks-2026.csv +23 -0
llm-benchmarks-2026.csv ADDED
@@ -0,0 +1,23 @@
+ provider,model,mmlu_score,humaneval_score,math_score,arena_elo,coding_rank,reasoning_rank,multilingual_rank,overall_tier
+ OpenAI,gpt-4o,88.7,90.2,76.6,1287,2,3,2,S
+ OpenAI,gpt-4o-mini,82.0,87.0,70.2,1140,5,8,5,A
+ OpenAI,o1,92.3,92.4,96.4,1350,1,1,3,S+
+ OpenAI,o3-mini,86.9,89.5,94.8,1310,3,2,6,S
+ Anthropic,claude-sonnet-4,88.5,93.7,78.3,1295,1,3,2,S
+ Anthropic,claude-haiku-3.5,79.8,88.1,69.5,1160,6,7,4,A
+ Anthropic,claude-opus-4,90.2,95.1,82.7,1330,1,2,1,S+
+ Google,gemini-2.5-pro,90.8,89.8,86.2,1340,2,1,1,S+
+ Google,gemini-2.5-flash,85.1,86.4,78.0,1250,4,4,3,S
+ Google,gemini-2.0-flash,82.5,84.2,72.1,1190,7,6,4,A
+ Meta,llama-3.3-70b,82.0,81.7,68.0,1180,8,8,7,A
+ Meta,llama-3.1-405b,86.1,84.3,73.8,1230,5,5,5,A+
+ Meta,llama-4-maverick,87.5,86.0,77.2,1260,4,4,4,S
+ Mistral,mistral-large,84.0,82.5,72.0,1200,6,6,3,A+
+ Mistral,mistral-small,78.5,79.0,65.3,1120,9,9,6,B+
+ Mistral,codestral,75.2,90.8,58.4,1170,2,10,8,A
+ DeepSeek,deepseek-v3,87.1,82.6,84.0,1280,4,3,5,S
+ DeepSeek,deepseek-r1,89.5,85.2,94.3,1320,3,1,6,S+
+ Cohere,command-r-plus,80.2,72.5,62.1,1130,10,9,2,B+
+ Cohere,command-r,75.8,68.3,55.7,1050,12,11,4,B
+ xAI,grok-2,85.5,83.0,71.5,1220,5,5,6,A+
+ xAI,grok-3-mini,81.0,80.5,78.9,1200,7,4,7,A
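The CSV has one row per model with numeric benchmark columns (`mmlu_score`, `humaneval_score`, `math_score`, `arena_elo`), per-category ranks, and a letter `overall_tier`. A minimal stdlib-only sketch of querying it — using a few rows copied from the diff above, with a hypothetical helper `top_by_elo` that is not part of the dataset:

```python
import csv
import io

# A few rows copied verbatim from llm-benchmarks-2026.csv in this commit.
SAMPLE = """provider,model,mmlu_score,humaneval_score,math_score,arena_elo,coding_rank,reasoning_rank,multilingual_rank,overall_tier
OpenAI,o1,92.3,92.4,96.4,1350,1,1,3,S+
Anthropic,claude-opus-4,90.2,95.1,82.7,1330,1,2,1,S+
Google,gemini-2.5-pro,90.8,89.8,86.2,1340,2,1,1,S+
Mistral,codestral,75.2,90.8,58.4,1170,2,10,8,A
"""

def top_by_elo(csv_text, n=3):
    """Return the n model names with the highest arena_elo."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    # arena_elo is stored as text in the CSV, so convert before sorting.
    rows.sort(key=lambda r: int(r["arena_elo"]), reverse=True)
    return [r["model"] for r in rows[:n]]

print(top_by_elo(SAMPLE))  # → ['o1', 'gemini-2.5-pro', 'claude-opus-4']
```

In practice the same pattern applies to the full 23-row file: read with `csv.DictReader`, cast the numeric columns, then sort or filter on whichever metric matters.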