---
license: apache-2.0
datasets:
- bigai/TongSIM-Asset
language:
- en
metrics:
- exact_match
new_version: zai-org/GLM-4.7
pipeline_tag: reinforcement-learning
library_name: transformers
tags:
- physics
- chemistry
- deepmind
---

# PsiFormer Checkpoint: Hydrogen → Oxygen

This repository contains pretrained **PsiFormer** checkpoints for electronic-structure modeling of atomic systems ranging from **hydrogen (Z = 1)** to **oxygen (Z = 8)**.

The model is designed for **variational Monte Carlo (VMC)** wavefunction modeling, using a Transformer-based architecture that captures electron–electron correlations efficiently and scalably.
24
+
25
+ ---
26
+
27
+ ## Model Overview
28
+
29
+ - **Architecture**: PsiFormer (Transformer-based wavefunction ansatz)
30
+ - **Task**: Electronic wavefunction approximation
31
+ - **Method**: Variational Monte Carlo (VMC)
32
+ - **Atomic range**: Hydrogen → Oxygen
33
+ - **Framework**: PyTorch
34
+ - **Precision**: FP32 (unless otherwise specified)
35
+
36
+ The model outputs parameters of a many-body wavefunction that can be used to estimate ground-state energies and other observables via Monte Carlo sampling.
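
As a minimal illustration of this sampling-based estimation (a toy hydrogen-atom example with an analytic 1s trial wavefunction, not the PsiFormer ansatz itself):

```python
import numpy as np

def local_energy_1s(r, alpha=1.0):
    # Local energy E_L = (H psi) / psi for the hydrogen-atom trial
    # wavefunction psi(r) = exp(-alpha * r), in Hartree atomic units:
    # E_L = -alpha**2 / 2 + (alpha - 1) / r
    return -0.5 * alpha**2 + (alpha - 1.0) / r

rng = np.random.default_rng(0)
# Draw radii from |psi|^2 r^2 ~ r^2 exp(-2 alpha r), a Gamma(3, 1/(2 alpha)) density.
r = rng.gamma(shape=3.0, scale=0.5, size=100_000)
energy = local_energy_1s(r).mean()  # -0.5 Ha exactly, since 1s is an eigenstate
```

Because the 1s state is an exact eigenstate, the local energy is constant and the estimator has zero variance; for a learned ansatz such as PsiFormer the same average is taken over MCMC samples instead.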

---

## Training Details

- **Systems**: Isolated atoms with atomic numbers Z = 1–8
- **Electrons**: Corresponding neutral-atom configurations
- **Optimization**: Stochastic gradient-based minimization of the variational energy
- **Sampling**: Metropolis–Hastings MCMC
- **Objective**: Minimize the expectation value of the Hamiltonian

Exact hyperparameters (learning rate, batch size, number of walkers, etc.) are checkpoint-specific and are documented in the accompanying configuration files where available.
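
The Metropolis–Hastings sampler referenced above can be sketched as follows. This is a generic random-walk implementation; the walker shapes, step sizes, and batching used for the actual checkpoints are assumptions:

```python
import numpy as np

def metropolis_hastings(log_prob, x0, n_steps, step_size=0.5, seed=0):
    """Random-walk Metropolis-Hastings with symmetric Gaussian proposals.

    In VMC, log_prob(x) = 2 * log|psi(x)|, so the walkers sample |psi|^2.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    lp = log_prob(x)
    samples = np.empty((n_steps,) + x.shape)
    for i in range(n_steps):
        proposal = x + step_size * rng.standard_normal(x.shape)
        lp_new = log_prob(proposal)
        # Accept with probability min(1, p(proposal) / p(x)).
        if np.log(rng.uniform()) < lp_new - lp:
            x, lp = proposal, lp_new
        samples[i] = x
    return samples

# Toy target: |psi(x)|^2 proportional to exp(-x^2), a Gaussian with variance 0.5.
chain = metropolis_hastings(lambda x: -np.sum(x**2), x0=np.zeros(1), n_steps=50_000)
```

After discarding a burn-in prefix, the chain's mean and variance match the target distribution.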

---

## Intended Use

This checkpoint is intended for:

- Initializing PsiFormer models for light atoms
- Transfer learning to larger atoms or small molecules
- Benchmarking neural quantum states
- Research and educational purposes in computational quantum physics

It is **not** intended for production chemistry workflows without further validation.

---

## Example Usage

```python
import torch
from psiformer import PsiFormer  # assumes the `psiformer` package is installed

# Instantiate the model with the same configuration used to train the checkpoint.
model = PsiFormer(...)

# Load the pretrained weights onto CPU and switch to evaluation mode.
state_dict = torch.load("psiformer_h_to_o.pt", map_location="cpu")
model.load_state_dict(state_dict)
model.eval()
```

Refer to the PsiFormer repository for full examples, including sampling and energy evaluation.
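
To make the end-to-end VMC loop concrete without depending on the PsiFormer API, here is a self-contained sketch that optimizes a one-parameter hydrogen trial wavefunction psi(r) = exp(-alpha * r) using the standard covariance form of the energy gradient. Exact Gamma sampling stands in for MCMC, and the single scalar parameter stands in for the network weights:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 1.5  # variational parameter; the exact ground state is alpha = 1

for _ in range(200):
    # Sample r from |psi|^2 r^2 ~ r^2 exp(-2 alpha r): a Gamma(3, 1/(2 alpha)) density.
    r = rng.gamma(shape=3.0, scale=1.0 / (2.0 * alpha), size=20_000)
    e_loc = -0.5 * alpha**2 + (alpha - 1.0) / r       # local energies
    o = -r                                            # d(log psi)/d(alpha)
    grad = 2.0 * np.mean((e_loc - e_loc.mean()) * o)  # VMC energy gradient
    alpha -= 0.1 * grad                               # plain SGD step

# alpha converges to 1.0, recovering the exact ground-state energy of -0.5 Ha
```

The covariance estimator `2 * mean((E_L - mean(E_L)) * d(log psi)/d(theta))` is the same objective gradient a PsiFormer training loop minimizes, with the network's log-amplitude derivatives in place of `-r`.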

---

## Limitations

- Trained only on **isolated atoms**, not molecules
- Accuracy degrades outside the Z = 1–8 range
- Performance depends strongly on sampling quality and optimization setup
- No relativistic or spin–orbit effects included

---

## Citation

If you use this checkpoint in academic work, please cite the corresponding PsiFormer paper or repository.

```bibtex
@misc{psiformer,
  title={PsiFormer: Transformer-based Neural Quantum States},
  author={...},
  year={202X}
}
```

---

## License

Released under the **Apache 2.0** license, as declared in the model card metadata above.

---

## Contact

For questions, issues, or collaborations, please open an issue in the main PsiFormer repository.