# Searchless Chess: ResNet-L (1711 ELO)
This is the large convolutional baseline of the Searchless Chess project: a deep Residual Network (ResNet) that plays chess at a strong club level (~1711 ELO), relying purely on neural intuition, without any search algorithms (no minimax, no alpha-beta pruning).
- Architecture: ResNet-L (Residual Network)
- Parameters: 12.9M
- Estimated ELO: 1711
- Training Data: 316M positions from Lichess/Stockfish
## Performance Metrics
| Metric | Value |
|---|---|
| ELO (Estimated) | 1711 |
| Tier 4 Puzzles (1000-1250) | 76% Accuracy |
| Tier 6 Puzzles (1500-1750) | 59% Accuracy |
| Tier 8 Puzzles (2000-2250) | 23% Accuracy |
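To put the estimated rating in context, an Elo difference maps to an expected score via the standard formula E = 1 / (1 + 10^((R_opp − R) / 400)). A quick sketch (the 1500-rated opponent is just an illustration, not a measured result from this project):

```python
# Expected score of a player rated r_a against an opponent rated r_b,
# per the standard Elo formula (illustrative; not part of this project).
def expected_score(r_a: float, r_b: float) -> float:
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

# A 1711-rated engine would score about 0.77 on average vs. a 1500 opponent.
print(round(expected_score(1711, 1500), 2))  # → 0.77
```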
## Why this model?
ResNet-L represents the traditional approach to chess position evaluation using deep convolutional layers. While it is a powerful model, it serves as a key benchmark in this project, demonstrating that even with 12.9M parameters, it is outperformed by the much smaller ViT-Small (2.64M). This comparison highlights the efficiency of global self-attention over local convolutions in understanding complex board dependencies.
## Quick Start & Usage
This repository provides everything needed to run or fine-tune the model:
- `model.keras`: Complete model (architecture + weights) in Keras 3 format.
- `model.weights.h5`: Standalone weights.
- `config.json`: Model architecture configuration.
**Which file to use?**

- Use `model.keras` for a quick "plug-and-play" experience, as it contains both the model architecture and the trained weights in one file.
- Use `config.json` and `model.weights.h5` if you prefer to build the model structure manually in your code and load only the parameters (e.g., for custom fine-tuning).
### Using the model in Python
To use this model for playing chess, use the `ChessAI` wrapper provided in the official GitHub repository. It handles the FEN-to-tensor encoding and move-selection logic.
```python
# 1. Clone the repository:
#    git clone https://github.com/mateuszgrzyb-pl/searchless-chess
# 2. Then use the following code:
import chess

from src.chess_ai.core.model import ChessAI

# Load the chess engine (provide the path to the downloaded model.keras)
chess_bot = ChessAI('path/to/resnet-large/model.keras')

# Create a chess board
board = chess.Board()

# Play a game
board.push_san("e4")                      # Your move
engine_move = chess_bot.make_move(board)  # Engine responds via neural intuition
board.push(engine_move)

print(board)
print(f"Engine played: {engine_move}")
```
For detailed instructions on board encoding, training, and evaluation, visit the Searchless Chess GitHub.
## Model Architecture
This model is based on a Deep Residual Network (ResNet), a classic and powerful architecture for chess engines (inspired by AlphaZero). Unlike Transformers, it uses convolutional layers to detect spatial patterns and piece coordination through a stacked series of residual blocks.
- Input Shape: 8x8x12 (standard piece-centric representation)
- Residual Blocks: 10 (each containing two convolutional layers)
- Filters: 256 (providing wide feature detection)
- Dense Layer: 512 units (for final position evaluation)
- Regularization: Batch Normalization and 0.5 Dropout to ensure robust generalization
While ResNets are excellent at capturing local patterns like pawn chains or king safety, this project demonstrates that they require significantly more parameters than Vision Transformers to achieve similar global board understanding.
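The 8x8x12 input above can be illustrated with a small FEN-to-tensor encoder. Note that the plane ordering and board orientation below are assumptions made for illustration; the project's own encoder (in the GitHub repository) is authoritative.

```python
# A minimal sketch of an 8x8x12 piece-centric board encoding from FEN.
# The exact plane order / orientation used by Searchless Chess is an
# assumption here; see the project's encoder for the real mapping.
import numpy as np

# One binary plane per piece type: white pieces 0-5, black pieces 6-11.
PLANES = {p: i for i, p in enumerate("PNBRQKpnbrqk")}

def fen_to_tensor(fen: str) -> np.ndarray:
    board = np.zeros((8, 8, 12), dtype=np.float32)
    placement = fen.split()[0]                    # piece-placement field only
    for rank, row in enumerate(placement.split("/")):  # rank 0 = 8th rank
        file = 0
        for ch in row:
            if ch.isdigit():
                file += int(ch)                   # run of empty squares
            else:
                board[rank, file, PLANES[ch]] = 1.0
                file += 1
    return board

start = fen_to_tensor("rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1")
print(start.shape)        # (8, 8, 12)
print(int(start.sum()))   # 32 pieces on the starting board
```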
## Dataset
The model was trained on the Lichess Stockfish Normalized dataset, consisting of 316M deduplicated positions with evaluations.
## Citation
If you use this model in your research, please cite:
```bibtex
@software{grzyb2025searchless,
  author    = {Grzyb, Mateusz},
  title     = {Searchless Chess: Master-Level Chess Through Pure Neural Intuition},
  year      = {2025},
  publisher = {Hugging Face},
  url       = {https://huggingface.co/mateuszgrzyb/searchless-chess-resnet-large}
}
```
## Contact
- Project Repository: GitHub
- LinkedIn: Mateusz Grzyb
- Website: MateuszGrzyb.pl