
Paper deep dive

An SO(3)-equivariant reciprocal-space neural potential for long-range interactions

Linfeng Zhang, Taoyong Cui, Dongzhan Zhou, Lei Bai, Shufei Zhang, Luca Rossi, Mao Su, Wanli Ouyang, Pheng-Ann Heng

Year: 2026 · Venue: arXiv preprint · Area: physics.chem-ph · Type: Preprint · Embeddings: 46

Abstract

Long-range electrostatic and polarization interactions play a central role in molecular and condensed-phase systems, yet remain fundamentally incompatible with locality-based machine-learning interatomic potentials. Although modern SO(3)-equivariant neural potentials achieve high accuracy for short-range chemistry, they cannot represent the anisotropic, slowly decaying multipolar correlations governing realistic materials, while existing long-range extensions either break SO(3) equivariance or fail to maintain energy–force consistency. Here we introduce EquiEwald, a unified neural interatomic potential that embeds an Ewald-inspired reciprocal-space formulation within an irreducible SO(3)-equivariant framework. By performing equivariant message passing in reciprocal space through learned equivariant k-space filters and an equivariant inverse transform, EquiEwald captures anisotropic, tensorial long-range correlations without sacrificing physical consistency. Across periodic and aperiodic benchmarks, EquiEwald captures long-range electrostatic behavior consistent with ab initio reference data and consistently improves energy and force accuracy, data efficiency, and long-range extrapolation. These results establish EquiEwald as a physically principled paradigm for long-range-capable machine-learning interatomic potentials.

Tags

ai-safety (imported, 100%) · physics.chem-ph (suggested, 92%) · preprint (suggested, 88%)

Links

Open PDF directly →


Full Text


An SO(3)-equivariant reciprocal-space neural potential for long-range interactions

Linfeng Zhang 1,3†, Taoyong Cui 1†, Dongzhan Zhou 2, Lei Bai 2, Shufei Zhang 2, Luca Rossi 3*, Mao Su 2,4*, Wanli Ouyang 1,2*, Pheng-Ann Heng 1

1 The Chinese University of Hong Kong, Hong Kong, 999077, China. 2 Shanghai Artificial Intelligence Laboratory, Shanghai, 200232, China. 3 Department of Electrical and Electronic Engineering, The Hong Kong Polytechnic University, Hong Kong, 999077, China. 4 Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China.

*Corresponding author(s). E-mail(s): luca.rossi@polyu.edu.hk; sumao@pjlab.org.cn; wlouyang@ie.cuhk.edu.hk
† These authors contributed equally to this work.

Abstract

Long-range electrostatic and polarization interactions play a central role in molecular and condensed-phase systems, yet remain fundamentally incompatible with locality-based machine-learning interatomic potentials. Although modern SO(3)-equivariant neural potentials achieve high accuracy for short-range chemistry, they cannot represent the anisotropic, slowly decaying multipolar correlations governing realistic materials, while existing long-range extensions either break SO(3) equivariance or fail to maintain energy–force consistency. Here we introduce EquiEwald, a unified neural interatomic potential that embeds an Ewald-inspired reciprocal-space formulation within an irreducible SO(3)-equivariant framework. By performing equivariant message passing in reciprocal space through learned equivariant k-space filters and an equivariant inverse transform, EquiEwald captures anisotropic, tensorial long-range correlations without sacrificing physical consistency. Across periodic and aperiodic benchmarks, EquiEwald captures long-range electrostatic behavior consistent with ab initio reference data and consistently improves energy and force accuracy, data efficiency, and long-range extrapolation.
These results establish EquiEwald as a physically principled paradigm for long-range-capable machine-learning interatomic potentials.

Keywords: Machine Learning Interatomic Potentials, Long Range Dependence, Graph Neural Network

arXiv:2603.18389v1 [physics.chem-ph] 19 Mar 2026

1 Introduction

Molecular dynamics (MD) simulation has become an indispensable tool for probing atomistic mechanisms underlying chemical reactivity, condensed-phase organization, materials discovery, and biomolecular function [1–5]. Yet despite decades of progress, achieving quantum-level accuracy at the spatial and temporal scales required for realistic materials and molecular systems remains a central challenge. Quantum-mechanical methods such as density functional theory (DFT) deliver reliable energies and forces [6], but their steep computational cost restricts system sizes and timescales. Classical force fields, while efficient, often lack the transferability and predictive fidelity needed for complex chemistries [7]. Machine-learning interatomic potentials (MLIPs) aim to overcome this trade-off by learning high-dimensional potential energy surfaces from ab initio data [8–11], enabling simulations that approach quantum accuracy with near-force-field efficiency [12].

Within this landscape, SO(3)-equivariant graph neural networks, including NequIP [13], Allegro [14], and MACE [15], have emerged as state-of-the-art short-range MLIPs. By explicitly encoding rotational and translational symmetries, these models achieve remarkable data efficiency and robustness across chemical space. However, their accuracy fundamentally relies on a strict locality assumption: the total energy is decomposed into atomic contributions determined solely by environments within a finite cutoff radius [16–18].
This assumption is intrinsically incompatible with systems dominated by long-range interactions, such as electrostatics, dipole–dipole coupling, and collective polarization, which decay slowly and exhibit pronounced anisotropy [19–21]. As a result, even the most advanced equivariant architectures struggle to represent the tensorial, multipolar correlations governing electrolytes, molecular crystals, and interfacial systems [16, 22, 23]. This limitation is not merely quantitative but representational: long-range physics cannot be faithfully captured by truncating interactions in real space, regardless of local model expressivity [24]. At its core, this reflects a mismatch between the tensorial symmetry of long-range physical interactions and the truncated or scalar representations imposed by locality-based learning.

Efforts to incorporate long-range interactions into MLIPs have continued for years. Early work augments short-range models with empirical electrostatics and dispersion baselines, which enforce the correct asymptotic behavior but are typically system-dependent and hard to obtain for complex, heterogeneous environments [25, 26]. A more general line of work learns intermediate physical surrogates, most commonly effective partial charges or electronic proxies such as Wannier-center-based representations, and then evaluates long-range electrostatics explicitly [27–31]. These approaches are easier to interpret and can capture polarization effects, but they often require additional supervision and remain sensitive to the non-uniqueness of charge partitioning. Message-passing models [13, 32, 33] extend the receptive field by stacking local convolutions, yet truly nonlocal couplings, such as interactions between fragments separated beyond the cutoff, are still difficult to capture.
Inspired by Ewald summation, more recent work models long-range effects in reciprocal space by aggregating global interactions through Fourier components with learnable frequency filters [34–36]. However, these frameworks encode reciprocal-space signals using scalar structure factors or latent charge-like variables, which average over angular information and can limit the description of directional electrostatics, multipolar correlations, and anisotropic polarization. LODE methods [24, 37] provide a complementary strategy by using local descriptors to parameterize Coulomb and more general $1/r^p$ far-field interactions. This formulation preserves analytic asymptotic behavior and enables the incorporation of physically motivated long-range terms within equivariant frameworks. However, because the long-range contributions are introduced through predefined functional forms, these approaches can be less flexible in capturing complex environment-dependent electronic correlations, and they do not fully integrate long-range interactions into a unified end-to-end learning architecture.

Here, we introduce EquiEwald, a unified SO(3)-equivariant neural interatomic potential that embeds Ewald summation within an irreducible representation (irrep) framework [38]. EquiEwald replaces the conventional locality assumption with a degree-resolved reciprocal-space pathway that enables long-range interactions to be represented at the tensorial level. By performing SO(3)-equivariant message passing directly in reciprocal space, the model can capture anisotropic and non-local correlations while preserving exact rotational symmetry and energy–force consistency.
Specifically, EquiEwald computes irrep-resolved structure factors, applies learned k-space filters under both periodic and aperiodic boundary conditions, and maps spectral signals back to atomic space through an equivariant inverse transform, yielding a unified treatment of real-space and reciprocal-space physics within a single differentiable architecture. Across a suite of demanding benchmarks, we show that this representation-level integration of reciprocal-space information leads to improved accuracy, data efficiency, and transferability in systems governed by non-local interactions. While EquiEwald does not aim to replace explicit electrostatic models, analytic Ewald formulations, or environment-dependent dielectric descriptions, it provides a physically structured, equivariant alternative for learning long-range interactions directly from ab initio data, addressing a fundamental representational limitation of locality-based MLIPs.

2 Results

2.1 Preliminaries

MLIPs aim to learn mappings from atomic structures to physical observables such as total energy $E$ and forces $F_i = -\nabla_{r_i} E$. In the absence of external fields, the underlying physical laws are invariant under global translations, rotations, and permutations of atoms with identical species. To ensure physical consistency and improve generalization, modern neural potentials typically enforce equivariance to these symmetries [13, 14]. In this work, we focus on SO(3)-equivariance, which requires that the learned atomic features transform as irreducible spherical tensors under global 3D rotation [39, 40]. Formally, for a rotation matrix $R \in SO(3)$ and atomic positions $\{r_i\}_{i=1}^{N}$, an $\ell$-type atomic representation $x^{(\ell)}_i \in \mathbb{C}^{2\ell+1}$ must satisfy

$$x^{(\ell)}_i \mapsto x^{(\ell)\prime}_i = D^{(\ell)}(R)\, x^{(\ell)}_i, \quad (1)$$

where $D^{(\ell)}(R)$ is the Wigner-D matrix of degree $\ell$ [41]. A formal proof demonstrating the SO(3) equivariance of each step within the EquiEwald module is provided in Supplementary Section 2. This property ensures that energy predictions remain invariant ($E' = E$) and that forces transform as proper vectors ($F'_i = R F_i$) under arbitrary global rotation [42]. Throughout this paper, we use this formulation to build rotation-equivariant neural architectures. Our proposed long-range module, EquiEwald, operates on such irreducible SO(3) representations in reciprocal space and preserves equivariance via spherical harmonic decomposition, frequency-domain filtering, and inverse accumulation, as detailed in the following sections.

Fig. 1: Overview. a, The whole model structure of EquiEwald: input atoms pass through a spherical-harmonics embedding, a short-range block, and a long-range Ewald block, whose outputs are fused before the output head predicts $E$ and $F$. b, The EquiEwald long-range block takes node features and wave-vector inputs, applies an SO(3) linear map and gating, then decomposes degree-$\ell$ irreps into real/imaginary branches with k-space filters driven by $\langle k, r_j \rangle$; an inverse Fourier transform and an MLP return real-space updates per degree. Outputs across degrees are concatenated and gated, then fused with normalized local features to yield a rotationally equivariant long-range interaction update.

2.2 Overall structure of EquiEwald

EquiEwald is a neural framework for learning interatomic potentials that unifies short-range geometric modeling with long-range physical interactions in a single, rotation-equivariant architecture. As shown in Fig.
1(a), it consists of two synergistic representation pathways: a short-range encoder that captures local atomic environments using a local graph message passing backbone, and a long-range spectral encoder that introduces non-local information via message passing in reciprocal space. Both components operate on the same atom-wise input, and their outputs are fused through a residual update to form the final atomic representation used for energy and force prediction. A central feature of the EquiEwald design, illustrated in Fig. 1(b), is its use of high spherical harmonic degrees to organize and propagate information in reciprocal space. This enables the model to resolve anisotropic, direction-dependent, and multipolar correlations that are critical for accurately modeling long-range interactions in systems with electrostatics, polarization, or delocalized electronic structure. Unlike purely local models that rely on cutoff-based neighborhoods, EquiEwald captures extended spatial dependencies through equivariant Fourier accumulation and degree-resolved filtering, maintaining SO(3) symmetry throughout. By embedding physically motivated priors into the representation space, EquiEwald provides a unified treatment of short-range and long-range physics. This design improves model accuracy, long-range extrapolation, and data efficiency across both periodic and aperiodic systems, including charged dimers, conjugated molecules, supramolecular assemblies, and biomolecular dynamics.

2.3 Experimental results

The effectiveness of EquiEwald was systematically evaluated on both aperiodic and periodic systems. For the main results, we adopted the eSCN framework [43] as our primary backbone. For periodic systems, we additionally included EquiformerV2 [44] as another baseline. Aperiodic benchmarks included the molecular dimer [36], AIMD-Chig [45], and buckyball catcher [46] datasets, while periodic systems were assessed using the OC20 dataset [47].

Molecular Dimer.
This system presents a prototypical case of long-range electrostatic interaction between spatially separated charged species. It comprises a cationic C₄N₂H₆ and an anionic C₃NOH₇ molecule, forming a loosely bound dimer whose intermolecular interactions are governed predominantly by long-range dipole–dipole forces. The training configurations cover centroid separations from 5 to 12 Å, while the test set focuses on even more distant geometries extending from 12 to 15 Å. At these distances, the two fragments lie well beyond the 5 Å local cutoff, rendering them effectively disconnected in standard message-passing architectures. Figure 2 compares the performance of four modeling approaches. The baseline eSCN model (Fig. 2a) does not capture the correct asymptotic decay of the interaction energy and yields a large MAE of 21.08 meV. As interactions beyond the local cutoff are not modeled, the predicted energies level off too early and break down in the extrapolative test set. Two alternative long-range methods are evaluated to address this issue (Fig. 2b and c). These methods substantially reduce the error, achieving MAEs of 1.18 meV and 2.28 meV, respectively, but visible deviations from the reference values remain at larger separations. In contrast, the eSCN + EquiEwald model (Fig. 2d) produces accurate energy predictions across the full range of separations, including the extrapolative test configurations, with a MAE of 0.78 meV. We also compared the results for another pair of molecular dimers. A detailed comparison of the energy and force MAE between eSCN and eSCN + EquiEwald is given in Supplementary Fig. S1.
These results show that EquiEwald overcomes the limitations of scalar methods in modeling long-range electrostatic interactions by incorporating higher-order tensor representations, enabling accurate capture of anisotropic and multipolar interactions, and significantly improving the modeling accuracy and physical consistency for molecular systems.

Fig. 2: Benchmark comparison between short-range and long-range models on charged molecular dimer systems. a–d, Predicted versus reference energy results for polar–polar dimers. The short-range model is compared with three long-range variants: (b) eSCN+EwaldMP (scalar) [35], (c) eSCN+LES [22], and (d) eSCN+EquiEwald. For energy prediction, the short-range model fits only short-distance configurations and fails to generalize beyond the cutoff, while all long-range variants recover the full interaction curve, including distant test points.

AIMD-Chig. Derived from ab initio molecular dynamics of the protein Chignolin, this dataset poses challenges due to its flexible backbone, complex conformational dynamics, and long-range intra-molecular interactions important for maintaining structural stability.

Table 1: Performance comparison of different long-range interaction modeling approaches on the AIMD-Chig dataset. The baseline eSCN model is compared with variants incorporating scalar EwaldMP and the proposed EquiEwald method. Mean absolute errors (MAE) for energy and atomic forces are reported in meV and meV/Å, respectively.

Model | Test Energy MAE (meV) | Test Force MAE (meV/Å)
eSCN [43] | 193.9 | 23.1
eSCN+EwaldMP (scalar) [35] | 132.8 | 20.3
eSCN+EquiEwald | 109.0 | 18.1

As shown in Table 1, the baseline eSCN model yields a test energy MAE of 193.9 meV and a force MAE of 23.1 meV/Å. With EquiEwald, these errors are reduced to 109.0 meV and 18.1 meV/Å, corresponding to improvements of 44% and 21%, respectively.
By incorporating reciprocal-space information with angular resolution, EquiEwald provides the representational capacity needed to model collective motions and long-range electrostatics in biomolecular systems. Table 1 further includes scalar EwaldMP, enabling a systematic comparison between invariant and equivariant treatments of long-range interactions.

We further evaluate the effect of explicitly modeling long-range interactions on thermodynamic properties using the fast-folding protein Chignolin. From the simulation trajectories, 100,000 conformational snapshots were uniformly sampled and analyzed using the Q score, which measures the fraction of native contacts relative to the reference folded structure. The model-predicted potential energies were then used to reconstruct the energy landscape and estimate thermodynamic quantities, including the folding free-energy difference ($\Delta G$). Detailed protocols and analysis procedures are described in the Methods. Compared to the baseline eSCN model, which relies solely on local message passing, incorporating the EquiEwald reciprocal-space module substantially improves the accuracy of energy predictions relevant to free-energy estimation. Specifically, the prediction error is reduced from 1.15 kcal/mol for eSCN to 0.67 kcal/mol for eSCN+EquiEwald, corresponding to an approximate 42% relative reduction. This improvement demonstrates that explicitly modeling long-range electrostatic interactions leads to more accurate thermodynamic estimates. In contrast, purely local representations struggle to capture the collective intra-molecular interactions that stabilize folded conformations in Chignolin.

Buckyball Catcher. The Buckyball Catcher system presents a strong challenge for interatomic potentials due to its supramolecular structure and long-range non-covalent interactions between the host and the encapsulated fullerene.
These interactions exceed typical cutoff distances, making them difficult for local models to capture. As shown in Supplementary Table S1, the baseline eSCN model yields a test energy MAE of 36.0 meV, while incorporating EquiEwald reduces this to 18.1 meV, corresponding to an improvement of nearly 50%. For force prediction, the MAE drops from 6.4 meV/Å to 6.1 meV/Å. This larger energy gain arises because reciprocal-space methods efficiently capture the global modes governing the potential energy surface, whereas local force gradients remain more sensitive to high-frequency spectral truncation. These improvements are consistently observed across multiple training runs, as evidenced by the error bars in Supplementary Fig. S2, which demonstrate the statistical stability of our method. These results demonstrate that EquiEwald enables the model to capture extended spatial correlations through reciprocal-space message passing with high spherical harmonic degrees, improving accuracy in systems with long-range supramolecular interactions.

Table 2: Mean absolute errors (MAE) of energy and force. Results are reported on the evaluation splits for OC20. Energy errors are reported in meV, and force errors in meV/Å. Our method is highlighted in bold.

Model | Test (Evaluation) Energy | Test (Evaluation) Force
eSCN [43] | 347.0 | 24.7
eSCN+EquiEwald | 321.2 | 24.1
EquiformerV2 [44] | 541.0 | 46.4
EquiformerV2+EquiEwald | 453.0 | 38.4

OC20. We further evaluate EquiEwald on the OC20 S2EF benchmark, which comprises catalytic surface–adsorbate structures under periodic boundary conditions. Because OC20 contains heterogeneous interfaces, the predicted energies and forces can be affected by nonlocal effects, including charge redistribution, surface polarization, and interactions induced by periodicity. This property makes OC20 a relevant benchmark for evaluating long-range interaction modeling. As shown in Table 2, EquiEwald consistently improves the performance of both backbones. The improvement is particularly pronounced for EquiformerV2, for which the energy and force MAEs decrease from 541.0 to 453.0 meV and from 46.4 to 38.4 meV/Å, respectively. For eSCN, EquiEwald also reduces the energy MAE from 347.0 to 321.2 meV and the force MAE from 24.7 to 24.1 meV/Å. Overall, these results indicate that reciprocal-space long-range modeling is beneficial for periodic interfacial systems, with a more evident effect on energy prediction than on force prediction.

3 Discussion

In this work, we introduced EquiEwald, an SO(3)-equivariant neural interatomic potential that integrates long-range interactions directly into the representation space of message-passing models. By embedding Ewald summation within a reciprocal-space framework and performing degree-resolved equivariant convolutions over high spherical harmonic components, EquiEwald captures multipolar and anisotropic correlations that are inaccessible to purely local methods. Our results across diverse benchmarks demonstrate the effectiveness and generality of this approach. In periodic systems, supramolecular assemblies, conjugated molecules, charged dimers, and biomolecular dynamics, EquiEwald consistently improves energy and force predictions, even under limited data. These improvements reflect the model's ability to resolve long-range electronic, electrostatic, and structural dependencies in a physically consistent manner. Rather than treating long-range interactions as external corrections, EquiEwald unifies short-range chemistry and long-range physics within a single, differentiable neural potential. This reframes the challenge of long-range modeling in MLIPs as a representational problem, and highlights the importance of embedding physical structure into the architecture itself, rather than deferring it to post hoc corrections or auxiliary terms.
By bridging quantum accuracy and force-field scalability, EquiEwald offers a promising direction for next-generation MLIPs applicable to realistic systems such as electrolytes, molecular crystals, interfaces, and proteins. Its reciprocal-space formulation may also be extended to incorporate dielectric response, long-range polarization, or time-dependent interactions in future work. More broadly, it suggests a general strategy for integrating global physical priors into geometric deep learning models for scientific applications.

4 Methods

4.1 Model implementations

Short-range Encoders. To capture local geometric information, we employ equivariant neural networks that process atomic structures while maintaining physical symmetries. We initialize the atom features $x^0_i$ via an embedding of the atomic number $z_i$. To represent the directional nature of local relations, these encoders utilize spherical harmonics to expand the relative displacement vectors between neighboring atoms. Here, the spherical harmonic degree $\ell$ determines the resolution of the angular information, where higher degrees capture more complex multi-body correlations, and the magnetic component $m$ (ranging from $-\ell$ to $\ell$) tracks the specific orientation of these features under rotation. Through equivariant operations, the encoder iteratively updates atom features to produce the final local atom representation, denoted as $x^{\mathrm{local}}_i$. Specific architectural details (eSCN [43] and EquiformerV2 [44]) are provided in Supplementary Section 3.

Long-range Encoders. The EquiEwald block is designed to capture non-local directional interactions by extending the Ewald message passing [35] framework to high-degree equivariant representations.
While the original Ewald MP primarily communicates isotropic scalar information, our high-degree approach enables the model to learn anisotropic long-range correlations and complex multi-body geometric patterns that extend across the entire molecular system. By incorporating tensorial features ($\ell > 0$), the block captures how the specific orientations and angular distributions of distant atomic environments mutually influence the potential energy surface, effectively modeling long-range directional dependencies that are lost in local or scalar-only paradigms.

The core concept of EquiEwald involves flipping the traditional rationale behind Ewald summation: rather than starting with a known physical kernel and seeking a decomposition, we parametrize a learnable filter that is specifically not short-ranged. By learning this component directly in Fourier space, we replace the conventional spatial distance limit with a cutoff on frequency, enabling the efficient capture of global structural information through reciprocal-space message passing. To process these high-degree representations, EquiEwald introduces degree-resolved message passing in reciprocal space. For each spherical harmonic degree $\ell \in \{0, 1, \ldots, \ell_{\max}\}$, we first compute structure factor embeddings $s^{(\ell)}_{\alpha,m}$ via a forward Fourier transform:

$$s^{(\ell)}_{\alpha,m} = \sum_{j \in S} x^{(\ell)}_{j,m} \exp\!\left(-\,i\, k_\alpha \cdot r_j\right) D(k_\alpha, r_j), \quad (2)$$

where $r_j$ denotes the position of atom $j$, $k_\alpha$ indexes the reciprocal-space sampling points, and $x^{(\ell)}_{j,m}$ is the $m$-th magnetic component of the degree-$\ell$ atomic feature. The scalar factor $D(k, r)$ denotes an accumulation window that weights each phase term during reciprocal-space summation; in our implementation this window is applied symmetrically in both the forward and inverse accumulation. In periodic systems we set $D(k, r) = 1$ since structure factors are evaluated directly on the reciprocal lattice.
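The forward accumulation of Eq. (2) can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the feature dimensions, random inputs, and the `structure_factors` helper are all assumed for demonstration, and the window $D$ is left as an optional argument (it is 1 in the periodic case).

```python
import numpy as np

def structure_factors(x_l, pos, kpts, window=None):
    """Eq. (2): s^(l)_{a,m} = sum_j x^(l)_{j,m} exp(-i k_a . r_j) D(k_a, r_j).

    x_l    : (N, 2l+1, C) degree-l atomic features (real parts, for illustration)
    pos    : (N, 3) atomic positions
    kpts   : (Nk, 3) reciprocal-space sampling points
    window : optional (Nk, N) accumulation window D(k_a, r_j); None means D = 1
    returns: (Nk, 2l+1, C) complex structure-factor embeddings
    """
    phase = np.exp(-1j * (kpts @ pos.T))            # (Nk, N) phase factors
    if window is not None:
        phase = phase * window                      # apply D(k, r) per (k, atom) pair
    # Sum over atoms j while keeping the magnetic (m) and channel (c) axes intact.
    return np.einsum('an,nmc->amc', phase, x_l.astype(complex))

# Toy inputs (all dimensions illustrative): 6 atoms, degree l = 2, 4 channels, 10 k-points.
N, l, C, Nk = 6, 2, 4, 10
rng = np.random.default_rng(1)
x_l = rng.normal(size=(N, 2 * l + 1, C))
pos = rng.normal(size=(N, 3))
kpts = rng.normal(size=(Nk, 3))
s = structure_factors(x_l, pos, kpts)               # (10, 5, 4) complex embeddings
```

Because the phase is shared across all $m$ and channel indices, the sum only translates the feature into reciprocal space; the tensorial structure along the $m$ axis is untouched, which is what later allows a channel-only filter to preserve equivariance.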
In aperiodic systems, where reciprocal space is discretized on a Cartesian voxel grid with spacing $\Delta k$, we use a voxel-averaging window to account for finite k-space resolution. To maintain SO(3) equivariance throughout the global update, a learnable filter $F(k_\alpha)$ is applied identically to all magnetic components $m$ within a specific degree $\ell$ before the inverse transform:

$$M^{(\ell)}_m(r_i) = \sum_{\alpha=1}^{N_k} \exp\!\left(i\, k_\alpha \cdot r_i\right) F(k_\alpha)\, s^{(\ell)}_{\alpha,m}. \quad (3)$$

By constraining $F(k_\alpha)$ to mix the channel dimensions, EquiEwald ensures the geometric integrity of the tensorial features is preserved. We use $F(k_\alpha)$ as a unified notation: in periodic systems it reduces to a shared, k-independent channel mixer $F \in \mathbb{R}^{C \times C}$, whereas in aperiodic systems it denotes a k-dependent diagonal spectral gate $F(k_\alpha) = \mathrm{diag}(f_\alpha)$ parameterized by the radial embedding $\psi(\|k_\alpha\|)$. The resulting reciprocal-space message $M^{(\ell)}_m(r_i)$ provides each atom with a comprehensive global update, complementing the local representation $x^{\mathrm{local}}_i$ to capture the full spectrum of interatomic interactions.

Periodic systems. For periodic systems, reciprocal vectors are sampled on the reciprocal lattice induced by the simulation cell. Let $(a_1, a_2, a_3) \in \mathbb{R}^3$ be the direct lattice basis and $(b_1, b_2, b_3)$ the reciprocal basis satisfying $b_i \cdot a_j = 2\pi \delta_{ij}$, i.e., with $V = a_1 \cdot (a_2 \times a_3)$,

$$b_1 = \frac{2\pi}{V}\,(a_2 \times a_3), \quad b_2 = \frac{2\pi}{V}\,(a_3 \times a_1), \quad b_3 = \frac{2\pi}{V}\,(a_1 \times a_2). \quad (4)$$

Following Ewald message passing, we truncate reciprocal sampling by an index box rather than a direct radial cutoff to keep the number of k-points fixed across structures. Specifically, we consider integer triples

$$(n_x, n_y, n_z) \in I = \{-N_x, \ldots, N_x\} \times \{-N_y, \ldots, N_y\} \times \{-N_z, \ldots, N_z\}, \quad (5)$$

and form reciprocal vectors

$$k_{n_x, n_y, n_z} = n_x b_1 + n_y b_2 + n_z b_3. \quad (6)$$

We then enumerate the resulting set $\{k_{n_x, n_y, n_z} : (n_x, n_y, n_z) \in I\}$ and relabel it as $\{k_\alpha\}_{\alpha=1}^{N_k}$ for convenient summation, where each $\alpha$ corresponds to one triple in $I$.
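The periodic sampling of Eqs. (4)–(6) amounts to building the reciprocal basis and enumerating an index box. A minimal sketch, with a hypothetical toy cell (the helper names and the example lattice are assumptions, not the paper's code):

```python
import numpy as np

def reciprocal_basis(a1, a2, a3):
    """Eq. (4): reciprocal basis vectors satisfying b_i . a_j = 2*pi*delta_ij."""
    V = a1 @ np.cross(a2, a3)                          # cell volume a1 . (a2 x a3)
    return np.stack([2 * np.pi / V * np.cross(a2, a3),
                     2 * np.pi / V * np.cross(a3, a1),
                     2 * np.pi / V * np.cross(a1, a2)])

def index_box_kpoints(B, Nx, Ny, Nz):
    """Eqs. (5)-(6): all k = nx*b1 + ny*b2 + nz*b3 over the fixed index box I."""
    n = np.stack(np.meshgrid(np.arange(-Nx, Nx + 1),
                             np.arange(-Ny, Ny + 1),
                             np.arange(-Nz, Nz + 1),
                             indexing='ij'), axis=-1).reshape(-1, 3)
    return n.astype(float) @ B                         # (Nk, 3), Nk fixed per index box

A = np.array([[4.0, 0.0, 0.0],                         # toy triclinic cell; rows a1, a2, a3
              [0.0, 5.0, 0.0],
              [1.0, 0.0, 6.0]])
B = reciprocal_basis(*A)
kpts = index_box_kpoints(B, 1, 1, 3)                   # OC20-style anisotropic (1, 1, 3) box
```

The index-box truncation keeps $N_k = (2N_x{+}1)(2N_y{+}1)(2N_z{+}1)$ constant across structures, and since $B$ co-rotates with the cell, the phases $k_\alpha \cdot r_j$ are rotation-invariant exactly as the text argues below.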
Since $\{k_\alpha\}_{\alpha=1}^{N_k}$ are reciprocal-lattice vectors computed analytically from the unit-cell geometry, the Fourier accumulation is performed directly on this lattice without any interpolation. Accordingly, no additional sampling window is required and we set $D(k, r) = 1$. Moreover, under a global rotation $R \in SO(3)$, the crystal lattice co-rotates: $a_i \mapsto R a_i$ implies $b_i \mapsto R b_i$ and hence $k_\alpha \mapsto R k_\alpha$. Together with $r_j \mapsto R r_j$, this preserves the phase, $(R k_\alpha) \cdot (R r_j) = k_\alpha \cdot r_j$, so the structure-factor computation is SO(3)-equivariant under global rotations.

After forward accumulation, we apply a learnable spectral filter to the structure factors in reciprocal space. In our implementation, the linear reciprocal-space filter is shared across all degrees $\ell$. Concretely, for each $k_\alpha$ we use a single channel mixer

$$\tilde{s}^{(\ell)}_{\alpha,m} = F\, s^{(\ell)}_{\alpha,m}, \quad F \in \mathbb{R}^{C \times C}, \quad (7)$$

where the same channel mixer $F$ is shared across degrees $\ell$. Crucially, for each fixed $\ell$, $F$ is applied identically to all magnetic components $m = -\ell, \ldots, \ell$ (i.e., it mixes channels only), thereby preserving SO(3)-equivariance. We parameterize $F$ in a low-rank (bottleneck) form

$$F = W_{\mathrm{up}} W_{\mathrm{down}}, \quad W_{\mathrm{down}} \in \mathbb{R}^{C_\downarrow \times C}, \quad W_{\mathrm{up}} \in \mathbb{R}^{C \times C_\downarrow}. \quad (8)$$

Here $W_{\mathrm{down}}$ performs a down-projection from $C$ channels to a compact latent dimension $C_\downarrow$, and $W_{\mathrm{up}}$ performs the corresponding up-projection back to $C$ channels. This low-rank form reduces parameters and computation relative to a full $C \times C$ matrix while retaining an expressive linear spectral response. In implementation, $W_{\mathrm{down}}$ and $W_{\mathrm{up}}$ correspond to the weights of the down- and up-projection linear layers, and their product is reused for all degrees $\ell$, whereas subsequent nonlinear updates are applied degree-wise.

Aperiodic systems. For aperiodic systems, reciprocal sampling is performed on a fixed Cartesian voxel grid instead of a cell-induced reciprocal lattice.
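Before turning to the aperiodic case, the periodic-branch filter of Eqs. (7)–(8) can be sketched numerically. The dimensions and random weights below are purely illustrative; the check at the end demonstrates the equivariance argument in the text: because $F$ mixes channels only, it commutes with any rotation acting on the magnetic axis (for $\ell = 1$ the Wigner-D matrix is the rotation matrix itself).

```python
import numpy as np

rng = np.random.default_rng(0)
C, C_down = 8, 2                                   # channel and bottleneck dims (illustrative)
W_down = rng.normal(size=(C_down, C))              # down-projection, Eq. (8)
W_up = rng.normal(size=(C, C_down))                # up-projection, Eq. (8)
F = W_up @ W_down                                  # low-rank channel mixer, rank <= C_down

# A degree l = 1 structure-factor block: rows are magnetic components m, columns channels.
s = rng.normal(size=(3, C)) + 1j * rng.normal(size=(3, C))
s_filt = s @ F.T                                   # Eq. (7): the same F for every m

# Rotation acting on the m axis (l = 1, Cartesian basis): rotation about z by 0.9 rad.
theta = 0.9
c, si = np.cos(theta), np.sin(theta)
D1 = np.array([[c, -si, 0.0], [si, c, 0.0], [0.0, 0.0, 1.0]])

# Channel mixing and rotation commute: filter-then-rotate == rotate-then-filter.
assert np.allclose(D1 @ s_filt, (D1 @ s) @ F.T)
assert np.linalg.matrix_rank(F) == C_down          # rank bounded by the bottleneck
```

The commutation holds because the rotation acts on the left (magnetic) index and the filter on the right (channel) index, so the order of the two matrix products is irrelevant.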
Given a frequency resolution $\Delta k$ and cutoff $k_{\max}$, we enumerate integer triples

$$(n_x, n_y, n_z) \in I_{\mathrm{box}} = \{-N_k, \ldots, N_k\}^3, \quad N_k = k_{\max} / \Delta k, \quad (9)$$

$$k_{n_x, n_y, n_z} = \Delta k\, (n_x, n_y, n_z), \quad (10)$$

and retain only points inside the spherical cutoff $\|k\| \le k_{\max}$. We denote the retained set by $\{k_\alpha\}_{\alpha=1}^{N_k}$. Unlike the periodic branch, we use a separable voxel window

$$D(k, r) = \prod_{d \in \{x, y, z\}} \mathrm{sinc}\!\left(\frac{\Delta k\, r_d}{2}\right), \quad (11)$$

which reduces discretization artifacts from finite $\Delta k$. In practice, coordinates are first centered and mapped to an internal SVD frame before this damping is applied, improving numerical stability across differently oriented structures. Instead of a single k-independent mixer, each reciprocal point uses a radial embedding $\psi(\|k_\alpha\|)$ and a bottleneck projection:

$$f_\alpha = W_{\mathrm{up}} \left( W_{\mathrm{down}}\, \psi(\|k_\alpha\|) \right) \in \mathbb{R}^C, \quad (12)$$

where $\psi(\|k_\alpha\|) \in \mathbb{R}^{C_\psi}$ is a fixed radial embedding, and $W_{\mathrm{down}} \in \mathbb{R}^{C_\downarrow \times C_\psi}$ and $W_{\mathrm{up}} \in \mathbb{R}^{C \times C_\downarrow}$ are learnable projections. Equivalently, this corresponds to using a diagonal channel mixer $F(k_\alpha) = \mathrm{diag}(f_\alpha)$, a channel-wise spectral gate at each reciprocal point. Here, $f_\alpha$ is applied channel-wise at $k_\alpha$, giving a resolution-aware spectral response that varies with frequency magnitude. Since $\psi(\|k_\alpha\|)$ depends only on the precomputed grid, these radial features can be prepared once and reused across structures, reducing runtime overhead.

Inverse accumulation. After reciprocal-space filtering, the long-range update is obtained in one step by degree-wise inverse accumulation, degree-specific nonlinear refinement, and scalar-conditioned gating:

$$x^{\mathrm{Ewald}}_i = \mathrm{Gate}\!\left( W_g h_{i,\ell=0},\; \bigoplus_{\ell=0}^{\ell_{\max}} \mathrm{MLP}^{(\ell)}_\eta\!\left( \left[ M^{(\ell)}_m(r_i) \right]_{m=-\ell}^{\ell} \right) \right), \quad (13)$$

where $h_{i,\ell=0} \in \mathbb{R}^C$ denotes the scalar ($\ell = 0$) block of the current atom representation and $g_i = W_g h_{i,\ell=0} \in \mathbb{R}^{\ell_{\max} C}$, with $W_g \in \mathbb{R}^{(\ell_{\max} C) \times C}$, are the gating coefficients. In our implementation, $g_i$ provides one $C$-dimensional gate per degree $\ell = 1, \ldots, \ell_{\max}$, and the corresponding gate is broadcast to all magnetic components $m = -\ell, \ldots, \ell$ within that degree.
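The aperiodic sampling of Eqs. (9)–(12) above can be sketched as follows. This is an illustrative reading, not the released code: the unnormalized convention $\mathrm{sinc}(x) = \sin(x)/x$ is assumed for Eq. (11), and the Gaussian radial-embedding width, number of centers, and weight shapes are made-up small values (the paper uses its settings of $\Delta k = 0.2$ Å⁻¹ and $k_{\max} = 0.6$ Å⁻¹ but a 128-dimensional embedding).

```python
import numpy as np

def voxel_kgrid(dk, kmax):
    """Eqs. (9)-(10): Cartesian voxel k-grid, then spherical cutoff |k| <= kmax."""
    N = int(np.ceil(kmax / dk))
    n = np.arange(-N, N + 1)
    grid = np.stack(np.meshgrid(n, n, n, indexing='ij'), axis=-1).reshape(-1, 3)
    k = dk * grid.astype(float)
    return k[np.linalg.norm(k, axis=1) <= kmax + 1e-12]   # tolerance for boundary points

def voxel_window(dk, pos):
    """Eq. (11): separable window D(k, r) = prod_d sinc(dk * r_d / 2).
    Assumes sinc(x) = sin(x)/x; np.sinc is normalized, so sinc(x) = np.sinc(x / pi)."""
    return np.prod(np.sinc(dk * pos / (2.0 * np.pi)), axis=-1)

kpts = voxel_kgrid(0.2, 0.6)       # paper's aperiodic settings: dk = 0.2, kmax = 0.6 (1/Angstrom)

# Eq. (12): per-point channel gate from a Gaussian radial embedding (dims illustrative).
rng = np.random.default_rng(0)
centers = np.linspace(0.0, 0.6, 16)                        # C_psi = 16 RBF centers (assumed)
psi = np.exp(-((np.linalg.norm(kpts, axis=1)[:, None] - centers) ** 2) / (2 * 0.05 ** 2))
W_down, W_up = rng.normal(size=(4, 16)), rng.normal(size=(8, 4))
f = psi @ W_down.T @ W_up.T                                # (Nk, C = 8): diagonal spectral gate
```

Because the grid, the window, and $\psi(\|k_\alpha\|)$ depend only on $\Delta k$ and $k_{\max}$, all of them can be precomputed once and reused across structures, which is the runtime saving noted above.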
Moreover, $M^{(\ell)}(\mathbf{r}_i) \in \mathbb{R}^{(2\ell+1) \times C}$ is the degree-$\ell$ inverse-accumulated message at atom $i$, and $\operatorname{MLP}^{(\ell)}: \mathbb{R}^{(2\ell+1) \times C} \to \mathbb{R}^{(2\ell+1) \times C}$ is the degree-specific nonlinear update network that acts on channels and is shared across $m$. Finally, $\operatorname{Gate}(g_i, \cdot)$ denotes scalar-conditioned gating that modulates the non-scalar ($\ell > 0$) blocks using $g_i$.

Information fusion. At interaction layer $t$, we fuse the running representation, the local update, and the long-range update via

$$
\mathbf{x}_i^{t+1} = \frac{1}{\sqrt{3}} \left( \mathbf{x}_i^{t} + \mathbf{x}_i^{\mathrm{Local}, t} + \mathbf{x}_i^{\mathrm{Ewald}, t} \right). \tag{14}
$$

Here, $t$ denotes the message-passing layer index. For brevity, earlier equations omit the layer superscript; all feature tensors can be interpreted as layer-dependent when this does not cause ambiguity.

4.2 Free-energy calculation

To characterize folding thermodynamics, we utilized the extensively sampled simulation trajectories from [48] (the same dataset as in [49]), comprising 100,000 snapshots for Chignolin. Snapshots were classified into folded and unfolded states using the native-contact fraction $Q$, with thresholds of $Q > 0.82$ for folded and $Q < 0.03$ for unfolded states [50]. For each snapshot, the potential energy was re-evaluated using both the eSCN and eSCN+EquiEwald potentials. State probabilities were then obtained by Boltzmann reweighting over the sampled conformational ensemble at temperature $T$,

$$
P(s) \propto \sum_{i \in s} \exp(-\beta E_i), \tag{15}
$$

where $E_i$ denotes the re-evaluated potential energy of snapshot $i$, $\beta = (k_B T)^{-1}$, and $s$ indicates either the folded or unfolded state. The free-energy difference between the two states was computed as

$$
\Delta G = -k_B T \ln \frac{P_{\mathrm{folded}}}{P_{\mathrm{unfolded}}}. \tag{16}
$$

This approach yields free-energy estimates directly from equilibrium ensembles without invoking additional biasing potentials or alchemical transformations.
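Because the normalization constant in Eq. (15) cancels in the ratio of Eq. (16), the reweighting reduces to a log-sum-exp over the re-evaluated snapshot energies of each state. The sketch below is illustrative, not the released code; the kcal/mol unit choice for $k_B$ is an assumption:

```python
import numpy as np

K_B = 0.0019872041  # Boltzmann constant in kcal/(mol K); unit choice is an assumption

def _logsumexp(x):
    """Numerically stable log(sum(exp(x)))."""
    m = np.max(x)
    return m + np.log(np.sum(np.exp(x - m)))

def delta_g(e_folded, e_unfolded, temperature):
    """Delta G = -kB T ln(P_folded / P_unfolded), Eqs. (15)-(16), from the
    re-evaluated potential energies of the folded and unfolded snapshots."""
    beta = 1.0 / (K_B * temperature)
    log_p_f = _logsumexp(-beta * np.asarray(e_folded))
    log_p_u = _logsumexp(-beta * np.asarray(e_unfolded))
    # the shared normalization constant of Eq. (15) cancels in the difference
    return -K_B * temperature * (log_p_f - log_p_u)
```

As a sanity check, with one snapshot per state the estimate reduces to the bare energy difference $E_{\mathrm{folded}} - E_{\mathrm{unfolded}}$.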
4.3 Training settings

The model is optimized using a composite loss over the total energy $E$ and the atom-wise forces $\mathbf{F}_i$:

$$
\mathcal{L} = \lambda_E \, \| E^{\mathrm{pred}} - E^{\mathrm{ref}} \|_1 + \lambda_F \, \frac{1}{3N} \sum_{i=1}^{N} \| \mathbf{F}_i^{\mathrm{pred}} - \mathbf{F}_i^{\mathrm{ref}} \|_1, \tag{17}
$$

where $\|\cdot\|_1$ denotes the mean absolute error (MAE), $E^{\mathrm{pred}}$ and $E^{\mathrm{ref}}$ are the predicted and reference total energies per configuration, $\mathbf{F}_i^{\mathrm{pred}}, \mathbf{F}_i^{\mathrm{ref}} \in \mathbb{R}^3$ are the predicted and reference forces on atom $i$, $N$ is the number of atoms in the configuration, and the factor $1/(3N)$ averages the force MAE over all $3N$ Cartesian components. $\lambda_E, \lambda_F > 0$ are scalar weights balancing the two objectives.

The training settings on the OC20 periodic-systems dataset are as follows. A cutoff radius of 6.0 Å is used to construct the local neighborhood graphs. For the reciprocal-space settings, we followed [35]. In reciprocal space, the number of included frequencies along the three lattice directions is adjusted to account for the average anisotropy of the unit cell: observing that the average reciprocal-space unit cell on OC20 is approximately three times narrower along the surface normal (the $z$-direction) than along the $x$ and $y$ directions, we set the per-direction frequency counts to 1, 1, and 3, respectively. In the loss function, the weighting coefficients for energy and forces are set to $\lambda_E = 1$ and $\lambda_F = 100$.

For aperiodic systems, including the supramolecular buckyball catcher and the charged dimer, the following settings are employed. The spherical-harmonic degree is set to $\ell_{\max} = 3$, and the Ewald-based long-range module uses a reciprocal-space cutoff of 0.6 Å⁻¹, a grid spacing of 0.2 Å⁻¹, and 128-dimensional Gaussian radial-basis functions to parameterize the frequency-domain filters. The loss-function weighting between energy and forces is fixed at $\lambda_E = 1$ and $\lambda_F = 100$. Additional experimental hyperparameters are provided in Supplementary Tables S3–S6.

Data availability

The datasets used in this work are available at Code Ocean.
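For a single configuration, Eq. (17) with the weights used here ($\lambda_E = 1$, $\lambda_F = 100$) can be written as the following minimal NumPy sketch; the actual training implementation operates on batches, and the function name is ours:

```python
import numpy as np

def composite_loss(e_pred, e_ref, f_pred, f_ref, lam_e=1.0, lam_f=100.0):
    """Eq. (17) for one configuration: an L1 energy term plus the force MAE
    averaged over all 3N Cartesian components, combined with weights lam_e, lam_f."""
    energy_term = abs(e_pred - e_ref)
    # mean over the (N, 3) force array averages over all 3N components at once
    force_term = np.mean(np.abs(np.asarray(f_pred) - np.asarray(f_ref)))
    return lam_e * energy_term + lam_f * force_term
```

Because the force term dominates with $\lambda_F = 100$, force errors of a few meV/Å contribute on the same scale as energy errors of a few hundred meV.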
Code availability

The source code for reproducing the findings in this paper is available at Code Ocean. It is licensed under the Apache License 2.0, which allows users to use, modify, and distribute the code freely, provided that proper attribution is given to the original authors. This open-source approach improves the reproducibility of our results and facilitates further research in this area.

References

[1] Hospital, A., Goñi, J.R., Orozco, M., Gelpí, J.L.: Molecular dynamics simulations: advances and applications. Advances and Applications in Bioinformatics and Chemistry, 37–47 (2015)

[2] Senftle, T.P., Hong, S., Islam, M.M., Kylasa, S.B., Zheng, Y., Shin, Y.K., Junkermeier, C., Engel-Herbert, R., Janik, M.J., Aktulga, H.M., et al.: The ReaxFF reactive force-field: development, applications and future directions. npj Computational Materials 2(1), 1–14 (2016)

[3] Karplus, M., Petsko, G.A.: Molecular dynamics simulations in biology. Nature 347(6294), 631–639 (1990)

[4] Yao, N., Chen, X., Fu, Z.-H., Zhang, Q.: Applying classical, ab initio, and machine-learning molecular dynamics simulations to the liquid electrolyte for rechargeable batteries. Chemical Reviews 122(12), 10970–11021 (2022)

[5] Zitnick, C.L., Das, A., Kolluru, A., Lan, J., Shuaibi, M., Sriram, A., Ulissi, Z., Wood, B.: Spherical channels for modeling atomic interactions. Advances in Neural Information Processing Systems 35, 8054–8067 (2022)

[6] Geerlings, P., De Proft, F., Langenaeker, W.: Conceptual density functional theory. Chemical Reviews 103(5), 1793–1874 (2003)

[7] Deringer, V.L., Bartók, A.P., Bernstein, N., Wilkins, D.M., Ceriotti, M., Csányi, G.: Gaussian process regression for materials and molecules. Chemical Reviews 121(16), 10073–10141 (2021)

[8] Ramprasad, R., Batra, R., Pilania, G., Mannodi-Kanakkithodi, A., Kim, C.: Machine learning in materials informatics: recent applications and prospects.
npj Computational Materials 3(1), 54 (2017)

[9] Butler, K.T., Davies, D.W., Cartwright, H., Isayev, O., Walsh, A.: Machine learning for molecular and materials science. Nature 559(7715), 547–555 (2018)

[10] Gubernatis, J., Lookman, T.: Machine learning in materials design and discovery: examples from the present and suggestions for the future. Physical Review Materials 2(12), 120301 (2018)

[11] Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R.: Machine learning force fields. Chemical Reviews 121(16), 10142–10186 (2021)

[12] Carleo, G., Cirac, I., Cranmer, K., Daudet, L., Schuld, M., Tishby, N., Vogt-Maranto, L., Zdeborová, L.: Machine learning and the physical sciences. Reviews of Modern Physics 91(4), 045002 (2019)

[13] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature Communications 13(1), 2453 (2022)

[14] Musaelian, A., Batzner, S., Johansson, A., Sun, L., Owen, C.J., Kornbluth, M., Kozinsky, B.: Learning local equivariant representations for large-scale atomistic dynamics. Nature Communications 14(1), 579 (2023)

[15] Batatia, I., Kovacs, D.P., Simm, G., Ortner, C., Csányi, G.: MACE: higher order equivariant message passing neural networks for fast and accurate force fields. Advances in Neural Information Processing Systems 35, 11423–11436 (2022)

[16] Anstine, D.M., Isayev, O.: Machine learning interatomic potentials and long-range physics. The Journal of Physical Chemistry A 127(11), 2417–2431 (2023)

[17] Schütt, K.T., Sauceda, H.E., Kindermans, P.-J., Tkatchenko, A., Müller, K.-R.: SchNet – a deep learning architecture for molecules and materials. The Journal of Chemical Physics 148(24) (2018)

[18] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs.
arXiv preprint arXiv:2003.03123 (2020)

[19] Levin, Y.: Polarizable ions at interfaces. Physical Review Letters 102(14), 147803 (2009)

[20] Bedrov, D., Piquemal, J.-P., Borodin, O., MacKerell Jr, A.D., Roux, B., Schröder, C.: Molecular dynamics simulations of ionic liquids and electrolytes using polarizable force fields. Chemical Reviews 119(13), 7940–7995 (2019)

[21] Borodin, O.: Polarizable force field development and molecular dynamics simulations of ionic liquids. The Journal of Physical Chemistry B 113(33), 11463–11478 (2009)

[22] Cheng, B.: Latent Ewald summation for machine learning of long-range interactions. npj Computational Materials 11(1), 80 (2025)

[23] Gao, R., Yam, C., Mao, J., Chen, S., Chen, G., Hu, Z.: A foundation machine learning potential with polarizable long-range interactions for materials modelling. Nature Communications 16(1), 10484 (2025)

[24] Huguenin-Dumittan, K.K., Loche, P., Haoran, N., Ceriotti, M.: Physics-inspired equivariant descriptors of nonbonded interactions. The Journal of Physical Chemistry Letters 14(43), 9612–9618 (2023)

[25] Niblett, S.P., Galib, M., Limmer, D.T.: Learning intermolecular forces at liquid–vapor interfaces. The Journal of Chemical Physics 155(16) (2021)

[26] Unke, O.T., Chmiela, S., Gastegger, M., Schütt, K.T., Sauceda, H.E., Müller, K.-R.: SpookyNet: learning force fields with electronic degrees of freedom and nonlocal effects. Nature Communications 12(1), 7273 (2021)

[27] Unke, O.T., Meuwly, M.: PhysNet: a neural network for predicting energies, forces, dipole moments, and partial charges. Journal of Chemical Theory and Computation 15(6), 3678–3693 (2019)

[28] Ko, T.W., Finkler, J.A., Goedecker, S., Behler, J.: A fourth-generation high-dimensional neural network potential with accurate electrostatics including nonlocal charge transfer.
Nature Communications 12(1), 398 (2021)

[29] Gao, A., Remsing, R.C.: Self-consistent determination of long-range electrostatics in neural network potentials. Nature Communications 13(1), 1572 (2022)

[30] Shaidu, Y., Pellegrini, F., Küçükbenli, E., Lot, R., de Gironcoli, S.: Incorporating long-range electrostatics in neural network potentials via variational charge equilibration from shortsighted ingredients. npj Computational Materials 10(1), 47 (2024)

[31] Zhang, L., Wang, H., Muniz, M.C., Panagiotopoulos, A.Z., Car, R., et al.: A deep potential model with long-range electrostatic interactions. The Journal of Chemical Physics 156(12) (2022)

[32] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: SchNet: a continuous-filter convolutional neural network for modeling quantum interactions. Advances in Neural Information Processing Systems 30 (2017)

[33] Deng, B., Zhong, P., Jun, K., Riebesell, J., Han, K., Bartel, C.J., Ceder, G.: CHGNet as a pretrained universal neural network potential for charge-informed atomistic modelling. Nature Machine Intelligence 5(9), 1031–1041 (2023)

[34] Yu, H., Hong, L., Chen, S., Gong, X., Xiang, H.: Capturing long-range interaction with reciprocal space neural network. arXiv preprint arXiv:2211.16684 (2022)

[35] Kosmala, A., Gasteiger, J., Gao, N., Günnemann, S.: Ewald-based long-range message passing for molecular graphs. In: International Conference on Machine Learning, pp. 17544–17563 (2023). PMLR

[36] King, D.S., Kim, D., Zhong, P., Cheng, B.: Machine learning of charges and long-range interactions from energies and forces. Nature Communications 16(1), 8763 (2025)

[37] Grisafi, A., Ceriotti, M.: Incorporating long-range physics in atomic-scale machine learning.
The Journal of Chemical Physics 151(20) (2019)

[38] Duval, A., Mathis, S.V., Joshi, C.K., Schmidt, V., Miret, S., Malliaros, F.D., Cohen, T., Lio, P., Bengio, Y., Bronstein, M.: A hitchhiker's guide to geometric GNNs for 3D atomic systems. arXiv preprint arXiv:2312.07511 (2023)

[39] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: rotation- and translation-equivariant neural networks for 3D point clouds. arXiv preprint arXiv:1802.08219 (2018)

[40] Geiger, M., Smidt, T.: e3nn: Euclidean neural networks. arXiv preprint arXiv:2207.09453 (2022)

[41] Weiler, M., Geiger, M., Welling, M., Boomsma, W., Cohen, T.S.: 3D steerable CNNs: learning rotationally equivariant features in volumetric data. Advances in Neural Information Processing Systems 31 (2018)

[42] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)

[43] Passaro, S., Zitnick, C.L.: Reducing SO(3) convolutions to SO(2) for efficient equivariant GNNs. In: International Conference on Machine Learning, pp. 27420–27438 (2023). PMLR

[44] Liao, Y.-L., Wood, B., Das, A., Smidt, T.: EquiformerV2: improved equivariant transformer for scaling to higher-degree representations. arXiv preprint arXiv:2306.12059 (2023)

[45] Wang, T., He, X., Li, M., Shao, B., Liu, T.-Y.: AIMD-Chig: exploring the conformational space of a 166-atom protein chignolin with ab initio molecular dynamics. Scientific Data 10(1), 549 (2023)

[46] Chmiela, S., Vassilev-Galindo, V., Unke, O.T., Kabylda, A., Sauceda, H.E., Tkatchenko, A., Müller, K.-R.: Accurate global machine learning force fields for molecules with hundreds of atoms.
Science Advances 9(2), 0873 (2023)

[47] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)

[48] Lindorff-Larsen, K., Piana, S., Dror, R.O., Shaw, D.E.: How fast-folding proteins fold. Science 334(6055), 517–520 (2011)

[49] Wang, T., He, X., Li, M., Li, Y., Bi, R., Wang, Y., Cheng, C., Shen, X., Meng, J., Zhang, H., et al.: Ab initio characterization of protein molecular dynamics with AI2BMD. Nature 635(8040), 1019–1027 (2024)

[50] Best, R.B., Hummer, G., Eaton, W.A.: Native contacts determine protein folding mechanisms in atomistic simulations. Proceedings of the National Academy of Sciences 110(44), 17874–17879 (2013)