Graph Machine Learning
yilunliao committed on
Commit 08fd201 · verified · 1 Parent(s): 633adcd

minor update on the wording

Files changed (1): README.md (+7, -4)
@@ -10,9 +10,12 @@ pipeline_tag: graph-ml
  <a href="https://arxiv.org/abs/2604.09130" style="color: #1a73e8; font-weight: bold; font-size: 20px;">Paper</a>
  </p>
 
- This repository contains the checkpoints for **EquiformerV3**, the third generation of the $SE(3)$-equivariant graph attention Transformer. EquiformerV3 is designed to advance efficiency, expressivity, and generality in 3D atomistic modeling.
-
- Building on EquiformerV2, this version introduces software optimizations achieving a $1.75\times$ speedup, structural improvements like equivariant merged layer normalization and smooth-cutoff attention, and SwiGLU-$S^2$ activations to incorporate many-body interactions while preserving strict equivariance. EquiformerV3 achieves state-of-the-art results on benchmarks including OC20, OMat24, and Matbench Discovery.
+ This repository contains the checkpoints for **EquiformerV3**, the third generation of the SE(3)-equivariant graph attention Transformer.
+ EquiformerV3 is designed to advance efficiency, expressivity, and generality in 3D atomistic modeling.
+ Building on [EquiformerV2](https://arxiv.org/abs/2306.12059), this version introduces (1) software optimizations,
+ (2) simple and effective modifications like equivariant merged layer normalization and attention with smooth cutoff, and
+ (3) SwiGLU-S^2 activations, which incorporate many-body interactions and preserve strict equivariance.
+ EquiformerV3 achieves state-of-the-art results on benchmarks including OC20, OMat24, and Matbench Discovery.
 
  Please refer to the [official GitHub repository](https://github.com/atomicarchitects/equiformer_v3) for detailed instructions on environment setup and usage.
 
@@ -96,7 +99,7 @@ Training consists of (1) direct pre-training on OMat24, (2) gradient fine-tuning
 
  ## Citation
 
- If you find this work helpful, please consider citing:
+ Please consider citing this work below if it is helpful:
 
  ```bibtex
  @article{equiformer_v3,
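
The reworded paragraph mentions attention with a smooth cutoff and SwiGLU activations. As a rough illustration of those two generic constructions (not the repository's actual implementation — all function names and the choice of a cosine envelope here are hypothetical), a minimal sketch in plain Python:

```python
import math

def cosine_cutoff(r: float, r_cut: float) -> float:
    """Smooth cutoff envelope (illustrative, not EquiformerV3's exact form):
    equals 1 at r = 0 and decays continuously to 0 at r = r_cut, so a
    neighbor's attention weight vanishes smoothly as it leaves the cutoff
    radius instead of dropping discontinuously."""
    if r >= r_cut:
        return 0.0
    return 0.5 * (math.cos(math.pi * r / r_cut) + 1.0)

def swiglu(x1: float, x2: float) -> float:
    """Scalar SwiGLU gate: Swish(x1) * x2, where Swish(x) = x * sigmoid(x).
    EquiformerV3's SwiGLU-S^2 variant applies this idea to features on the
    sphere; this scalar version only shows the gating pattern."""
    return x1 / (1.0 + math.exp(-x1)) * x2
```

The envelope is multiplied into each pairwise attention weight, and the gate replaces a plain pointwise nonlinearity in the feed-forward path; see the official repository for the equivariant versions actually used.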