---
license: apache-2.0
datasets:
- bigai/TongSIM-Asset
language:
- en
metrics:
- exact_match
pipeline_tag: reinforcement-learning
library_name: transformers
tags:
- physics
- chemistry
- deepmind
---
# PsiFormer Checkpoint: Hydrogen → Oxygen

This repository contains pretrained **PsiFormer** checkpoints for electronic-structure modeling of atomic systems ranging from **Hydrogen (Z=1)** to **Oxygen (Z=8)**.

The model is designed for **variational quantum Monte Carlo (VMC)**–style wavefunction modeling, with a Transformer-based architecture that captures electron–electron correlations efficiently and scales well with system size.

---
## Model Overview

- **Architecture**: PsiFormer (Transformer-based wavefunction ansatz)
- **Task**: Electronic wavefunction approximation
- **Method**: Variational Monte Carlo (VMC)
- **Atomic range**: Hydrogen → Oxygen
- **Framework**: PyTorch
- **Precision**: FP32 (unless otherwise specified)

The model parameterizes a many-body wavefunction that can be used to estimate ground-state energies and other observables via Monte Carlo sampling.
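The Monte Carlo estimate mentioned above follows the standard VMC identity (a textbook result, not specific to this checkpoint): the variational energy is the average of the local energy over samples drawn from the wavefunction density.

$$
E(\theta) = \frac{\langle \psi_\theta | \hat{H} | \psi_\theta \rangle}{\langle \psi_\theta | \psi_\theta \rangle}
= \mathbb{E}_{x \sim |\psi_\theta|^2}\!\left[ E_L(x) \right],
\qquad
E_L(x) = \frac{(\hat{H}\psi_\theta)(x)}{\psi_\theta(x)}
$$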
---

## Training Details

- **Systems**: Isolated atoms with atomic numbers Z = 1–8
- **Electrons**: Neutral-atom electron configurations
- **Optimization**: Stochastic gradient–based minimization of the variational energy
- **Sampling**: Metropolis–Hastings MCMC
- **Objective**: Minimize the expectation value of the Hamiltonian

Exact hyperparameters (learning rate, batch size, number of walkers, etc.) are checkpoint-specific and documented in the accompanying configuration files when available.
---

## Intended Use

This checkpoint is intended for:

- Initializing PsiFormer models for light atoms
- Transfer learning to larger atoms or small molecules
- Benchmarking neural quantum states
- Research and educational purposes in computational quantum physics

It is **not** intended for production chemistry workflows without further validation.

---
## Example Usage

```python
import torch
from psiformer import PsiFormer

# Instantiate the model with the architecture hyperparameters
# matching the checkpoint (see the accompanying config files).
model = PsiFormer(...)
state_dict = torch.load("psiformer_h_to_o.pt", map_location="cpu")
model.load_state_dict(state_dict)
model.eval()
```

Refer to the PsiFormer repository for full examples, including sampling and energy evaluation.
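The post-sampling energy evaluation reduces to simple statistics. A hedged sketch, assuming local energies $E_L(x_i)$ have already been evaluated at MCMC samples (how to compute them depends on the PsiFormer API; the numbers below are synthetic, chosen near the hydrogen ground state for illustration):

```python
import numpy as np

# Synthetic local energies for illustration: in practice these come from
# evaluating E_L(x) = (H psi)(x) / psi(x) at your MCMC samples.
rng = np.random.default_rng(0)
local_energies = rng.normal(-0.5, 0.01, size=4096)  # in Hartree

energy = local_energies.mean()
# Standard error of the mean; real runs should also correct for
# autocorrelation between successive MCMC samples.
stderr = local_energies.std(ddof=1) / np.sqrt(local_energies.size)
```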
---

## Limitations

- Trained only on **isolated atoms**, not molecules
- Accuracy degrades outside the Z = 1–8 range
- Performance depends strongly on sampling quality and optimization setup
- No relativistic or spin–orbit effects included

---

## Citation

If you use this checkpoint in academic work, please cite the corresponding PsiFormer paper or repository.

```bibtex
@misc{psiformer,
  title={PsiFormer: Transformer-based Neural Quantum States},
  author={...},
  year={202X}
}
```

---
## License

Released under the **Apache 2.0** license, as declared in the model-card metadata above.

---
## Contact

For questions, issues, or collaborations, please open an issue in the main PsiFormer repository.