Trapped Fermions Through Kolmogorov-Arnold Wavefunctions
- URL: http://arxiv.org/abs/2512.07800v1
- Date: Mon, 08 Dec 2025 18:30:30 GMT
- Title: Trapped Fermions Through Kolmogorov-Arnold Wavefunctions
- Authors: Paulo F. Bedaque, Jacob Cigliano, Hersh Kumar, Srijit Paul, Suryansh Rajawat
- Abstract summary: We investigate a variational Monte Carlo framework for trapped one-dimensional mixtures of spin-$\frac{1}{2}$ fermions. We construct universal neural-network wavefunction ansätze using Kolmogorov-Arnold networks.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We investigate a variational Monte Carlo framework for trapped one-dimensional mixtures of spin-$\frac{1}{2}$ fermions using Kolmogorov-Arnold networks (KANs) to construct universal neural-network wavefunction ansätze. The method can, in principle, achieve arbitrary accuracy, limited only by the Monte Carlo sampling, and was checked against exact results at sub-percent precision. For attractive interactions, it captures pairing effects, and in the impurity case it agrees with known results. We present a method of systematic transfer learning in the number of network parameters, allowing for efficient training to a target precision. We vastly increase the efficiency of the method by incorporating the short-distance behavior of the wavefunction into the ansatz without biasing the method.
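The abstract describes a variational Monte Carlo (VMC) approach: sample configurations from the square of a trial wavefunction and average the local energy. The sketch below illustrates the generic VMC recipe on a deliberately simple toy problem, a single particle in a 1D harmonic trap with a Gaussian trial wavefunction. It is not the paper's KAN ansatz or its fermionic mixture; all function names here are illustrative.

```python
import numpy as np

# Minimal 1D VMC sketch (illustrative, NOT the paper's KAN ansatz):
# one particle in a harmonic trap, trial wavefunction psi_a(x) = exp(-a x^2).
# Units hbar = m = omega = 1; the exact ground state has a = 1/2, E = 1/2.

def local_energy(x, a):
    # E_L(x) = -(1/2) psi''/psi + x^2/2 = a + (1/2 - 2 a^2) x^2
    return a + (0.5 - 2.0 * a * a) * x * x

def vmc_energy(a, n_steps=20000, step=1.0, seed=0):
    """Metropolis sampling of |psi_a|^2, then average the local energy."""
    rng = np.random.default_rng(seed)
    x = 0.0
    energies = []
    for i in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        # Acceptance ratio |psi(x_new)|^2 / |psi(x)|^2 for the Gaussian trial state
        if rng.random() < np.exp(-2.0 * a * (x_new**2 - x**2)):
            x = x_new
        if i > n_steps // 10:  # discard burn-in
            energies.append(local_energy(x, a))
    return float(np.mean(energies))
```

At the optimal parameter `a = 0.5` the local energy is constant, so the estimator has zero variance; away from it, the variational principle guarantees the estimate lies above the true ground-state energy. Minimizing this energy over network parameters (here just `a`) is the training loop that, in the paper, is carried out over a KAN-based ansatz instead.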
Related papers
- Semi-Implicit Functional Gradient Flow for Efficient Sampling [30.32233517392456]
We propose a functional gradient ParVI method that uses perturbed particles with Gaussian noise as the approximation family. We show that the corresponding functional gradient flow, which can be estimated via denoising score matching with neural networks, exhibits strong theoretical convergence guarantees. In addition, we present an adaptive version of our method that automatically selects the appropriate noise magnitude during sampling.
arXiv Detail & Related papers (2024-10-23T15:00:30Z) - Variance-Reducing Couplings for Random Features [57.73648780299374]
Random features (RFs) are a popular technique to scale up kernel methods in machine learning.
We find couplings to improve RFs defined on both Euclidean and discrete input spaces.
We reach surprising conclusions about the benefits and limitations of variance reduction as a paradigm.
arXiv Detail & Related papers (2024-05-26T12:25:09Z) - Low-rank extended Kalman filtering for online learning of neural networks from streaming data [71.97861600347959]
We propose an efficient online approximate Bayesian inference algorithm for estimating the parameters of a nonlinear function from a potentially non-stationary data stream.
The method is based on the extended Kalman filter (EKF), but uses a novel low-rank plus diagonal decomposition of the posterior matrix.
In contrast to methods based on variational inference, our method is fully deterministic, and does not require step-size tuning.
arXiv Detail & Related papers (2023-05-31T03:48:49Z) - A Score-Based Model for Learning Neural Wavefunctions [41.82403146569561]
We provide a new framework for obtaining properties of quantum many-body ground states using score-based neural networks.
Our new framework does not require explicit probability distribution and performs the sampling via Langevin dynamics.
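The Langevin-dynamics sampling mentioned above can be illustrated in isolation: the unadjusted Langevin algorithm drifts along the score (the gradient of the log density) and adds Gaussian noise at each step. The sketch below samples from a standard normal, whose score is known in closed form; in the score-based framework, a neural network would supply `grad_log_p` instead. All names here are illustrative.

```python
import numpy as np

def langevin_sample(grad_log_p, x0, n_steps=50000, eps=0.01, seed=0):
    """Unadjusted Langevin algorithm: x <- x + (eps/2) * score(x) + sqrt(eps) * noise."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_steps)
    for k in range(n_steps):
        x = x + 0.5 * eps * grad_log_p(x) + np.sqrt(eps) * rng.normal()
        samples[k] = x
    return samples

# Target: standard normal, with closed-form score grad log p(x) = -x
s = langevin_sample(lambda x: -x, x0=0.0)
```

For small step sizes the chain's stationary distribution approaches the target, so the sample mean and variance should be close to 0 and 1; no explicit probability density or normalization constant is ever evaluated, only its gradient.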
arXiv Detail & Related papers (2023-05-25T23:44:27Z) - Neural Wave Functions for Superfluids [3.440236962613469]
We study the unitary Fermi gas, a system with strong, short-range, two-body interactions known to possess a superfluid ground state.
We use the recently developed Fermionic neural network (FermiNet) wave function Ansatz for variational Monte Carlo calculations.
arXiv Detail & Related papers (2023-05-11T17:23:29Z) - Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers. We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles. Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z) - Improving the performance of fermionic neural networks with the Slater exponential Ansatz [0.351124620232225]
We propose a technique for the use of fermionic neural networks (FermiNets) with the Slater exponential Ansatz for electron-nuclear and electron-electron distances.
arXiv Detail & Related papers (2022-02-21T11:15:42Z) - Determinant-free fermionic wave function using feed-forward neural networks [0.0]
We propose a framework for finding the ground state of many-body fermionic systems by using feed-forward neural networks.
We show that the accuracy of the approximation can be improved by optimizing the "variance" of the energy simultaneously with the energy itself.
These improvements can be applied to other approaches based on variational Monte Carlo methods.
arXiv Detail & Related papers (2021-08-19T11:51:36Z) - Sampling-free Variational Inference for Neural Networks with Multiplicative Activation Noise [51.080620762639434]
We propose a more efficient parameterization of the posterior approximation for sampling-free variational inference.
Our approach yields competitive results for standard regression problems and scales well to large-scale image classification tasks.
arXiv Detail & Related papers (2021-03-15T16:16:18Z) - Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z) - Almost-Matching-Exactly for Treatment Effect Estimation under Network Interference [73.23326654892963]
We propose a matching method that recovers direct treatment effects from randomized experiments where units are connected in an observed network.
Our method matches units almost exactly on counts of unique subgraphs within their neighborhood graphs.
arXiv Detail & Related papers (2020-03-02T15:21:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information listed (including all of its content) and is not responsible for any consequences arising from its use.