New Optimizer Enhances Neural Network Wavefunction Efficiency
Recent work in computational physics has introduced an optimizer designed to reduce the cost of training neural network wavefunctions. The paper "A Kaczmarz-inspired approach to accelerate the optimization of neural network wavefunctions" by Gil Goldshlager, Nilin Abrahamsen, and Lin Lin proposes Subsampled Projected-Increment Natural Gradient Descent (SPRING). The method targets the high computational cost of optimizing neural network wavefunctions, which are crucial for accurately modeling the electronic structure of atoms and small molecules.
The SPRING optimizer combines ideas from the minimum-step stochastic reconfiguration optimizer (MinSR) and the classical randomized Kaczmarz method for solving linear least-squares problems. Each optimization step amounts to solving a subsampled least-squares problem, and rather than computing a fresh minimum-norm solution as MinSR does, SPRING projects the previous update onto the solution set of the current subsample, in the spirit of a block Kaczmarz iteration. The authors show that SPRING outperforms both MinSR and the widely used Kronecker-factored approximate curvature (KFAC) method when the learning rates of all methods are optimally tuned. On the oxygen atom, for example, SPRING reaches chemical accuracy after forty thousand training iterations, while neither MinSR nor KFAC does so even after one hundred thousand iterations.
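To make the Kaczmarz connection concrete, here is a minimal NumPy sketch of a projected-increment update of the kind SPRING builds on, applied to a toy consistent linear system. This is not the authors' code: the function name projected_increment_step and the parameters mu and lam are illustrative choices. With mu = 1 the step reduces to classical block randomized Kaczmarz; a decay mu < 1 discounts stale information when the least-squares problem itself drifts between iterations, as it does during wavefunction optimization.

```python
import numpy as np

def projected_increment_step(O, eps, x_prev, mu=1.0, lam=1e-3):
    """One Kaczmarz-style projected-increment update (illustrative).

    O      : (n, p) matrix for the current minibatch of equations,
             with far fewer rows than columns (n << p).
    eps    : (n,) right-hand side for the minibatch.
    x_prev : (p,) previous iterate (the carried-over "increment").
    mu     : decay on the previous iterate; mu = 1 recovers plain
             block randomized Kaczmarz.
    lam    : Tikhonov regularization for the small n x n solve.
    """
    n = O.shape[0]
    # Correct the decayed previous iterate so the new iterate (nearly)
    # satisfies the current block of equations O @ x = eps. Only an
    # n x n system is solved, which is what keeps the cost low
    # when n << p.
    rhs = eps - mu * (O @ x_prev)
    y = np.linalg.solve(O @ O.T + lam * np.eye(n), rhs)
    return mu * x_prev + O.T @ y

# Toy demonstration: recover x_true from random row-blocks of a
# consistent overdetermined system, analogous to how each minibatch
# of Monte Carlo samples defines a fresh subsampled least-squares
# problem at every optimization step.
rng = np.random.default_rng(0)
p, n_total, n_batch = 200, 1000, 32
A = rng.normal(size=(n_total, p)) / np.sqrt(p)
x_true = rng.normal(size=p)
b = A @ x_true

x = np.zeros(p)
for _ in range(500):
    idx = rng.choice(n_total, size=n_batch, replace=False)
    x = projected_increment_step(A[idx], b[idx], x)

print("relative residual:", np.linalg.norm(A @ x - b) / np.linalg.norm(b))
```

Roughly speaking, in the wavefunction setting the block matrix corresponds to a subsampled Jacobian of the log-wavefunction and the right-hand side to local-energy residuals; because these change from step to step, reusing the previous update with a decay factor rather than restarting from zero is what gives the method its efficiency.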
This development matters because it opens the door to applying neural network wavefunctions to larger systems, where computational cost has been the main obstacle. The findings suggest that SPRING could enable larger and more accurate simulations in quantum chemistry and materials science. The paper is available at arXiv:2401.10190.