isaacchung, Samoed, Claude Sonnet 4 committed
Commit fc9f76d · unverified · 1 Parent(s): 4c12129

Generate cached results (#376)


* feat: add script and workflow to generate cached results for leaderboard

Adds a generate_cached_results.py script to pre-generate __cached_results.json
containing all benchmark results. Pre-generating the cache saves 100+ seconds
of leaderboard startup time on fresh builds.

The GitHub Actions workflow automatically regenerates this cache file
whenever new results are merged to main.

* refactor: compress cached results with gzip

Changes output from __cached_results.json to __cached_results.json.gz
using gzip compression. This reduces file size by ~80-90% while
maintaining fast decompression.

Updated both the generation script and workflow to handle gzipped files.
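The gzip step amounts to writing the serialized JSON through `gzip.open` in text mode. A minimal sketch, using a hypothetical payload in place of the real benchmark results:

```python
import gzip
import json
import tempfile
from pathlib import Path

# Hypothetical payload standing in for the benchmark results; the real file
# is serialized from the leaderboard's result objects.
results = {"model": "example-model", "scores": [0.81, 0.79, 0.84]}

json_str = json.dumps(results)
out = Path(tempfile.mkdtemp()) / "__cached_results.json.gz"

# gzip.open in text mode ('wt') compresses the JSON transparently on write.
with gzip.open(out, "wt", encoding="utf-8") as f:
    f.write(json_str)

# Reading back only needs 'rt' mode; decompression happens on the fly.
with gzip.open(out, "rt", encoding="utf-8") as f:
    restored = json.load(f)

assert restored == results
```

Text-mode `gzip.open` keeps the read path a drop-in replacement for `open`, which is why decompression stays cheap relative to regenerating the results.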

* Apply suggestions from code review

Co-authored-by: Roman Solomatin <samoed.roman@gmail.com>

* fix: make cached results compatible with leaderboard app

- Add step to create uncompressed __cached_results.json in mteb/leaderboard/
- Update change detection to track both compressed and uncompressed files
- Modify commit step to include both cache file formats
- Enhance reporting to show sizes of both cache files

This resolves the compatibility issue where app.py couldn't find
the cache file created by the generate script due to:
- File format mismatch (gzip vs plain JSON)
- Location mismatch (repo root vs mteb/leaderboard/)
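The compatibility step reduces to decompressing the gzipped cache into the path the app reads from. A sketch under the paths named in this commit message, with a stand-in file in place of the generated cache:

```shell
# Stand-in compressed cache (the real one is produced by
# scripts/generate_cached_results.py).
printf '{"example": true}' | gzip > __cached_results.json.gz

# Decompress into the location app.py expects; -c streams to stdout,
# leaving the original .gz file in place for the cached-data branch.
mkdir -p mteb/leaderboard
gunzip -c __cached_results.json.gz > mteb/leaderboard/__cached_results.json
```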

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4 <noreply@anthropic.com>

* feat: commit cached results to cached-data branch

- Replace commit logic to push __cached_results.json.gz to cached-data branch
- Switch to orphaned cached-data branch to store only cache files
- Add timestamp and file size to commit messages for better tracking
- Include proper error handling and status reporting
- Use same pattern as update-cached-results.yml for consistency

This creates a dedicated branch for storing cached results separately
from the main codebase, making it easier to manage large cache files.
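The orphan-branch pattern can be demonstrated in a scratch repository; branch and file names below match the workflow, the repo setup itself is illustrative:

```shell
# Scratch repo for the demo.
demo=$(mktemp -d)
cd "$demo"
git init -q
git config user.email "github-actions[bot]@users.noreply.github.com"
git config user.name "github-actions[bot]"
printf 'stand-in cache' > __cached_results.json.gz

# --orphan starts a branch with no parent commits, so the large cache file
# never enters main's history; the fallback handles branches that already exist.
git checkout --orphan cached-data 2>/dev/null || git checkout cached-data
git add __cached_results.json.gz
git commit -q -m "Update cached results - $(date -u '+%Y-%m-%d %H:%M:%S UTC')"
```

In the workflow the commit is then pushed with `git push origin cached-data`, omitted here since the scratch repo has no remote.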

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4 <noreply@anthropic.com>

* Preserve README.md file when updating cached-data branch

- Replace 'git rm -rf .' with selective file removal
- Exclude README.md and __cached_results.json.gz from removal
- Ensures branch documentation remains intact during workflow runs
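The selective removal can be sketched the same way; file names match the workflow, the seed commit is illustrative:

```shell
# Scratch repo with the two files to keep plus one stale tracked file.
demo=$(mktemp -d)
cd "$demo"
git init -q
git config user.email "demo@example.com"
git config user.name "demo"
printf 'docs'  > README.md
printf 'cache' > __cached_results.json.gz
printf 'old'   > stale_file.txt
git add -A
git commit -q -m "seed"

# Remove every tracked file except README.md and the cache file, instead of
# the blanket `git rm -rf .` the previous revision used.
git ls-files | grep -v "README.md" | grep -v "__cached_results.json.gz" \
  | xargs -r git rm -q
```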

* Use model_dump_json() instead of manual JSON serialization

- Replace model_dump(mode='json') + json.dumps() with direct model_dump_json()
- Simplifies code and makes it more idiomatic
- Combines serialization and writing steps for better readability

Addresses review comment: https://github.com/embeddings-benchmark/results/pull/376#discussion_r2645537795
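The difference between the two serialization styles can be shown with a toy pydantic model; `BenchmarkResult` is illustrative, standing in for whatever result type `model_dump_json()` is called on in the script:

```python
import json

from pydantic import BaseModel


# Illustrative model standing in for the results object (the script's use of
# model_dump_json() implies a pydantic v2 model).
class BenchmarkResult(BaseModel):
    model: str
    main_score: float


result = BenchmarkResult(model="example-model", main_score=0.81)

# Before: dump to a JSON-compatible dict, then serialize by hand.
manual = json.dumps(result.model_dump(mode="json"))

# After: one call that serializes directly to a JSON string.
direct = result.model_dump_json()

# Both produce equivalent JSON; only whitespace may differ.
assert json.loads(manual) == json.loads(direct)
```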

---------

Co-authored-by: Roman Solomatin <samoed.roman@gmail.com>
Co-authored-by: Claude Sonnet 4 <noreply@anthropic.com>

.github/workflows/generate_cached_results.yml ADDED
@@ -0,0 +1,90 @@
+name: Generate Cached Results
+
+on:
+  push:
+    branches: [main]
+  # Allow manual trigger for testing
+  workflow_dispatch:
+
+jobs:
+  generate-cache:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Free disk space
+        run: |
+          sudo rm -rf /usr/share/dotnet
+          sudo rm -rf /opt/ghc
+          sudo rm -rf /usr/local/share/boost
+          docker system prune -af
+
+      - name: Checkout repository
+        uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+          token: ${{ secrets.GITHUB_TOKEN }}
+
+      - name: Setup Python
+        uses: actions/setup-python@v5
+        with:
+          python-version: '3.10'
+          cache: 'pip'
+
+      - name: Install dependencies
+        run: |
+          pip install git+https://github.com/embeddings-benchmark/mteb.git
+
+      - name: Generate cached results
+        run: |
+          python scripts/generate_cached_results.py
+        env:
+          PYTHONUNBUFFERED: 1
+
+      - name: Configure Git
+        run: |
+          git config --global user.name "github-actions[bot]"
+          git config --global user.email "github-actions[bot]@users.noreply.github.com"
+
+      - name: Update cached-data branch
+        run: |
+          # Check if __cached_results.json.gz was created
+          if [ ! -f "__cached_results.json.gz" ]; then
+            echo "❌ Cached results file not found"
+            exit 1
+          fi
+
+          # Get file size for logging
+          FILE_SIZE=$(stat -f%z __cached_results.json.gz 2>/dev/null || stat -c%s __cached_results.json.gz)
+          echo "📦 Generated cache file: $(echo "scale=1; $FILE_SIZE/1024/1024" | bc -l)MB"
+
+          # Switch to cached-data branch (create if it doesn't exist)
+          git checkout --orphan cached-data 2>/dev/null || git checkout cached-data
+
+          # Remove all tracked files except README.md and the cached results
+          git ls-files | grep -v "README.md" | grep -v "__cached_results.json.gz" | xargs -r git rm 2>/dev/null || true
+
+          # Add only the cached results file
+          git add __cached_results.json.gz
+
+          # Check if there are changes to commit
+          if git diff --staged --quiet; then
+            echo "✅ No changes in cached results, skipping commit"
+          else
+            # Commit with timestamp and file size
+            TIMESTAMP=$(date -u '+%Y-%m-%d %H:%M:%S UTC')
+            COMMIT_MSG="Update cached results - $TIMESTAMP ($(echo "scale=1; $FILE_SIZE/1024/1024" | bc -l)MB)"
+            git commit -m "$COMMIT_MSG"
+
+            # Push to remote
+            git push origin cached-data
+            echo "✅ Successfully updated cached-data branch"
+          fi
+
+      - name: Report status
+        if: always()
+        run: |
+          if [ -f "__cached_results.json.gz" ]; then
+            FILE_SIZE=$(stat -f%z __cached_results.json.gz 2>/dev/null || stat -c%s __cached_results.json.gz)
+            echo "✅ Workflow completed. Cache file size: $(echo "scale=1; $FILE_SIZE/1024/1024" | bc -l)MB"
+          else
+            echo "❌ Workflow failed - no cache file generated"
+          fi
scripts/generate_cached_results.py ADDED
@@ -0,0 +1,95 @@
+#!/usr/bin/env python3
+"""
+Script to generate __cached_results.json.gz for the MTEB leaderboard.
+
+This pre-generates the cached results file that the leaderboard uses,
+which can save 100+ seconds on fresh leaderboard builds.
+
+Usage:
+    python generate_cached_results.py
+
+Output:
+    Creates __cached_results.json.gz in the repo root directory
+"""
+
+import gzip
+import json
+import logging
+import sys
+import time
+from pathlib import Path
+
+import mteb
+from mteb.cache import ResultCache
+
+logging.basicConfig(
+    level=logging.INFO,
+    format='[%(asctime)s] %(levelname)s - %(message)s',
+    datefmt='%H:%M:%S'
+)
+logger = logging.getLogger(__name__)
+
+
+def generate_cached_results():
+    """Generate the cached results JSON file."""
+    start_time = time.time()
+
+    logger.info("Initializing ResultCache...")
+    cache = ResultCache(Path(__file__).parent.parent)
+
+    # The remote repo should already be cloned from previous runs
+    logger.info("Using existing remote results repository...")
+
+    # Load all model names
+    logger.info("Getting all model names...")
+    models_start = time.time()
+    all_model_names = [model_meta.name for model_meta in mteb.get_model_metas()]
+    models_time = time.time() - models_start
+    logger.info(f"Found {len(all_model_names)} models in {models_time:.2f}s")
+
+    # Load results for all models
+    logger.info("Loading results from cache...")
+    load_start = time.time()
+    all_results = cache.load_results(
+        models=all_model_names,
+        only_main_score=True,
+        require_model_meta=False,
+        include_remote=True,
+    )
+    load_time = time.time() - load_start
+    logger.info(f"Loaded results in {load_time:.2f}s")
+
+    # Serialize to JSON and write to gzip file
+    repo_root = Path(__file__).parent.parent
+    output_path = repo_root / "__cached_results.json.gz"
+    logger.info(f"Serializing to JSON and writing to {output_path}...")
+    write_start = time.time()
+    json_str = all_results.model_dump_json()
+    with gzip.open(output_path, 'wt', encoding='utf-8') as f:
+        f.write(json_str)
+    write_time = time.time() - write_start
+    logger.info(f"Serialized and written in {write_time:.2f}s")
+
+    # Report file size
+    file_size_mb = output_path.stat().st_size / (1024 * 1024)
+    uncompressed_size_mb = len(json_str) / (1024 * 1024)
+    compression_ratio = (1 - file_size_mb / uncompressed_size_mb) * 100
+    logger.info(f"Generated {output_path} ({file_size_mb:.1f} MB)")
+    logger.info(f"Uncompressed size: {uncompressed_size_mb:.1f} MB")
+    logger.info(f"Compression ratio: {compression_ratio:.1f}%")
+
+    total_time = time.time() - start_time
+    logger.info(f"Total time: {total_time:.2f}s")
+
+    return output_path
+
+
+if __name__ == "__main__":
+    try:
+        output_file = generate_cached_results()
+        logger.info(f"✅ Success! Generated {output_file}")
+    except Exception as e:
+        logger.error(f"❌ Failed: {e}")
+        import traceback
+        traceback.print_exc()
+        sys.exit(1)