Machine Learning Discovery of Optimal Quadrature Rules for Isogeometric Analysis
- URL: http://arxiv.org/abs/2304.01802v1
- Date: Tue, 4 Apr 2023 13:59:07 GMT
- Title: Machine Learning Discovery of Optimal Quadrature Rules for Isogeometric Analysis
- Authors: Tomas Teijeiro, Jamie M. Taylor, Ali Hashemian, David Pardo
- Abstract summary: We propose the use of machine learning techniques to find optimal quadrature rules in isogeometric analysis.
We find optimal quadrature rules for spline spaces when using IGA discretizations with up to 50 uniform elements and degrees up to 8.
- Score: 0.5161531917413708
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose the use of machine learning techniques to find optimal quadrature
rules for the construction of stiffness and mass matrices in isogeometric
analysis (IGA). We initially consider 1D spline spaces of arbitrary degree
spanned over uniform and non-uniform knot sequences, and then the generated
optimal rules are used for integration over higher-dimensional spaces via
tensor products. The quadrature rule search is posed as an optimization
problem and solved by a machine learning strategy based on gradient descent.
However, since the optimization space is highly non-convex, the success of the
search strongly depends on the number of quadrature points and the parameter
initialization. Thus, we use a dynamic programming strategy that initializes
the parameters from the optimal solution over the spline space with a lower
number of knots. With this method, we found optimal quadrature rules for spline
spaces when using IGA discretizations with up to 50 uniform elements and
polynomial degrees up to 8, showing the generality of the approach in this
scenario. For non-uniform partitions, the method also finds an optimal rule in
a reasonable number of test cases. We also assess the generated optimal rules
in two practical case studies, namely, the eigenvalue problem of the Laplace
operator and the eigenfrequency analysis of freeform curved beams, where the
latter problem shows the applicability of the method to curved geometries. In
particular, the proposed method results in savings with respect to traditional
Gaussian integration of up to 44% in 1D, 68% in 2D, and 82% in 3D spaces.
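As a rough illustration of the optimization the abstract describes (not the authors' implementation), the sketch below runs gradient descent on quadrature nodes and weights so that monomials on [0, 1] are integrated exactly; the paper instead targets products of spline basis functions. The warm-start arguments `x0`/`w0` mimic its dynamic-programming initialization, and all names and hyperparameters here are illustrative assumptions. Note that a poor random initialization can stall in a local minimum, which is exactly the non-convexity difficulty the abstract mentions.

```python
import numpy as np

def quadrature_loss_and_grads(x, w, max_degree):
    """Loss: sum of squared residuals when integrating the monomials
    t^k, k = 0..max_degree, over [0, 1] (exact integral is 1/(k+1))."""
    L = 0.0
    gx = np.zeros_like(x)
    gw = np.zeros_like(w)
    for k in range(max_degree + 1):
        r = np.dot(w, x**k) - 1.0 / (k + 1)     # residual for t^k
        L += r**2
        gw += 2.0 * r * x**k                    # d(r^2)/dw_i
        if k > 0:
            gx += 2.0 * r * w * k * x**(k - 1)  # d(r^2)/dx_i
    return L, gx, gw

def find_rule(n_points, max_degree, lr=0.05, steps=20000, x0=None, w0=None):
    """Gradient-descent search for nodes x and weights w.  x0, w0 allow
    warm-starting from a previously found rule, mimicking the paper's
    dynamic-programming initialization over growing knot sequences."""
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 1, n_points) if x0 is None else x0.copy()
    w = np.full(n_points, 1.0 / n_points) if w0 is None else w0.copy()
    for _ in range(steps):
        _, gx, gw = quadrature_loss_and_grads(x, w, max_degree)
        x -= lr * gx
        w -= lr * gw
    return x, w, quadrature_loss_and_grads(x, w, max_degree)[0]

# A 2-point rule exact for cubics should recover Gauss-Legendre on [0, 1].
x, w, loss = find_rule(n_points=2, max_degree=3)
print(np.sort(x), w, loss)  # approx [0.2113, 0.7887], [0.5, 0.5], ~0
```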
Related papers
- Differentially Private Optimization with Sparse Gradients [60.853074897282625]
We study differentially private (DP) optimization problems under sparsity of individual gradients.
Building on this, we obtain pure- and approximate-DP algorithms with almost optimal rates for convex optimization with sparse gradients.
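For context, a generic dense-gradient DP-SGD step looks like the sketch below; the paper's contribution is exploiting the sparsity of the individual gradients to beat this baseline, which the sketch does not attempt. The clipping and noise calibration shown are the standard ones, not the paper's algorithm.

```python
import numpy as np

def dp_sgd_step(w, per_example_grads, clip_norm, noise_mult, lr, rng):
    """One standard DP-SGD step: clip each per-example gradient to
    clip_norm, average, and add Gaussian noise scaled to the clip."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    mean_g = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, noise_mult * clip_norm / len(clipped),
                       size=w.shape)
    return w - lr * (mean_g + noise)
```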
arXiv Detail & Related papers (2024-04-16T20:01:10Z)
- Fast Screening Rules for Optimal Design via Quadratic Lasso Reformulation [0.135975510645475]
In this work, we derive safe screening rules that can be used to discard inessential samples.
The new tests are much faster to compute, especially for problems involving a parameter space of high dimension.
We show how an existing homotopy algorithm to compute the regularization path of the lasso method can be reparametrized with respect to the squared $\ell_1$-penalty.
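As background on safe screening in general (the paper derives different, faster tests for optimal design), a minimal sketch of the classic SAFE sphere test for the lasso, attributed to El Ghaoui et al., might look like this; the formula is the standard one, not taken from this paper.

```python
import numpy as np

def safe_screen_lasso(X, y, lam):
    """Classic SAFE test for min 0.5*||y - X w||^2 + lam*||w||_1:
    feature j may be safely discarded if
    |x_j^T y| < lam - ||x_j|| * ||y|| * (lam_max - lam) / lam_max."""
    corr = np.abs(X.T @ y)
    lam_max = corr.max()  # smallest lam giving the all-zero solution
    thresh = (lam - np.linalg.norm(X, axis=0) * np.linalg.norm(y)
              * (lam_max - lam) / lam_max)
    return corr < thresh  # True -> feature can be dropped
```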
arXiv Detail & Related papers (2023-10-13T08:10:46Z)
- Generative Models for Anomaly Detection and Design-Space Dimensionality Reduction in Shape Optimization [0.0]
Our work presents a novel approach to shape optimization, with the twofold objective to improve the efficiency of global algorithms and to promote the generation of high-quality designs.
This is accomplished by reducing the number of original design variables, defining a new reduced subspace in which the geometrical variance is maximized.
Numerical results show that the new framework improves the convergence of global optimization algorithms while generating only designs with high-quality geometrical features.
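The paper uses generative models for the reduction; as a purely linear stand-in for intuition only, a PCA-style sketch of "keep the fewest directions that capture most of the geometrical variance" could read:

```python
import numpy as np

def reduced_design_subspace(D, var_fraction=0.95):
    """Illustrative linear (PCA) reduction of a design matrix D
    (n_designs x n_variables): keep the fewest principal directions
    capturing var_fraction of the variance across designs."""
    Dc = D - D.mean(axis=0)                      # center the designs
    U, s, Vt = np.linalg.svd(Dc, full_matrices=False)
    explained = np.cumsum(s**2) / np.sum(s**2)   # cumulative variance
    k = int(np.searchsorted(explained, var_fraction)) + 1
    return Vt[:k]  # rows span the reduced design subspace
```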
arXiv Detail & Related papers (2023-08-08T04:57:58Z)
- Global optimization of MPS in quantum-inspired numerical analysis [0.0]
The study focuses on the search for the lowest eigenstates of a Hamiltonian.
Five algorithms are introduced: imaginary-time evolution, steepest gradient descent, an improved descent, an implicitly restarted Arnoldi method, and density matrix renormalization group (DMRG) optimization.
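As a toy illustration of the simplest of these strategies, the sketch below runs steepest descent on the Rayleigh quotient of a dense symmetric matrix; the paper works in the MPS representation, which this deliberately ignores, and the step size here is an illustrative assumption.

```python
import numpy as np

def lowest_eigenstate_sd(H, steps=5000, lr=0.1, seed=0):
    """Steepest descent on the Rayleigh quotient <v|H|v> / <v|v> for the
    lowest eigenpair of a symmetric H.  lr must be small relative to
    the spectral range of H for the iteration to converge."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=H.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(steps):
        Hv = H @ v
        lam = v @ Hv                 # Rayleigh quotient (v is unit norm)
        v -= lr * 2.0 * (Hv - lam * v)  # gradient step
        v /= np.linalg.norm(v)       # retract back to the unit sphere
    return v @ (H @ v), v

lam, v = lowest_eigenstate_sd(np.diag([0.0, 1.0, 2.0]))
print(lam)  # approx 0.0, the lowest eigenvalue
```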
arXiv Detail & Related papers (2023-03-16T16:03:51Z)
- Mean-Square Analysis with An Application to Optimal Dimension Dependence of Langevin Monte Carlo [60.785586069299356]
This work provides a general framework for the non-asymptotic analysis of sampling error in the 2-Wasserstein distance.
Our theoretical analysis is further validated by numerical experiments.
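For reference, the sampler being analyzed belongs to the Langevin Monte Carlo family; a minimal unadjusted Langevin (ULA) sketch, with step size and target chosen purely for illustration, is:

```python
import numpy as np

def ula_sample(grad_U, x0, step, n_steps, rng):
    """Unadjusted Langevin algorithm targeting (approximately) exp(-U):
    x_{k+1} = x_k - step * grad_U(x_k) + sqrt(2 * step) * xi_k."""
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = (x - step * grad_U(x)
             + np.sqrt(2.0 * step) * rng.normal(size=x.shape))
    return x

# Standard Gaussian target: U(x) = ||x||^2 / 2, so grad_U(x) = x.
rng = np.random.default_rng(0)
print(ula_sample(lambda x: x, np.zeros(3), step=0.01, n_steps=1000, rng=rng))
```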
arXiv Detail & Related papers (2021-09-08T18:00:05Z)
- An Operator-Splitting Method for the Gaussian Curvature Regularization Model with Applications in Surface Smoothing and Imaging [6.860238280163609]
We propose an operator-splitting method for a general Gaussian curvature model.
The proposed method is not sensitive to the choice of parameters, and its efficiency and performance are demonstrated numerically.
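The paper's splitting scheme is specific to the Gaussian curvature model; as generic background only, a first-order Lie-Trotter splitting for a linear ODE u' = (A + B)u can be sketched as follows, with all names assumed for illustration.

```python
import numpy as np
from scipy.linalg import expm

def lie_splitting(A, B, u0, dt, n_steps):
    """Lie-Trotter splitting for u' = (A + B) u: advance with A, then
    with B, within each step; first-order accurate in dt."""
    eA, eB = expm(dt * A), expm(dt * B)  # exact sub-flows over one step
    u = np.array(u0, dtype=float)
    for _ in range(n_steps):
        u = eB @ (eA @ u)
    return u
```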
arXiv Detail & Related papers (2021-08-04T08:59:41Z)
- Optimal oracle inequalities for solving projected fixed-point equations [53.31620399640334]
We study methods that use a collection of random observations to compute approximate solutions by searching over a known low-dimensional subspace of the Hilbert space.
We show how our results precisely characterize the error of a class of temporal difference learning methods for the policy evaluation problem with linear function approximation.
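As a concrete instance of the methods being characterized, a minimal TD(0) sketch with linear value-function approximation is given below; the feature table and replay setup are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def td0_linear(features, transitions, alpha=0.05, gamma=0.9, sweeps=200):
    """TD(0) with linear approximation V(s) = features[s] @ w.
    `features` has one row per state; `transitions` is a list of
    observed (s, r, s_next) tuples replayed repeatedly."""
    w = np.zeros(features.shape[1])
    for _ in range(sweeps):
        for s, r, s_next in transitions:
            td_error = r + gamma * features[s_next] @ w - features[s] @ w
            w += alpha * td_error * features[s]
    return w
```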
arXiv Detail & Related papers (2020-12-09T20:19:32Z)
- Convergence of adaptive algorithms for weakly convex constrained optimization [59.36386973876765]
We prove the $\tilde{\mathcal{O}}(t^{-1/4})$ rate of convergence for the norm of the gradient of the Moreau envelope.
Our analysis works with mini-batch size of $1$, constant first and second order moment parameters, and possibly smooth optimization domains.
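A representative adaptive update of the kind analyzed (AMSGrad-style, with constant moment parameters and mini-batch size 1) can be sketched as follows; the exact variant treated in the paper may differ.

```python
import numpy as np

def amsgrad_step(w, g, m, v, vhat, lr=1e-3,
                 beta1=0.9, beta2=0.999, eps=1e-8):
    """One AMSGrad step with constant moment parameters beta1, beta2;
    g is a (possibly size-1 mini-batch) stochastic gradient."""
    m = beta1 * m + (1 - beta1) * g          # first-moment estimate
    v = beta2 * v + (1 - beta2) * g**2       # second-moment estimate
    vhat = np.maximum(vhat, v)               # monotone second moment
    w = w - lr * m / (np.sqrt(vhat) + eps)
    return w, m, v, vhat
```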
arXiv Detail & Related papers (2020-06-11T17:43:19Z)
- Effective Dimension Adaptive Sketching Methods for Faster Regularized Least-Squares Optimization [56.05635751529922]
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching.
We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT).
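A minimal sketch-and-solve baseline with a Gaussian embedding is shown below; the paper's contribution is choosing the sketch size adaptively via the effective dimension, which this plain sketch does not do.

```python
import numpy as np

def sketched_ridge(A, b, lam, sketch_rows, rng):
    """Sketch-and-solve for min ||A x - b||^2 + lam * ||x||^2 with a
    Gaussian embedding S of shape (sketch_rows, n): solve the small
    sketched normal equations instead of the full ones."""
    n, d = A.shape
    S = rng.normal(0.0, 1.0 / np.sqrt(sketch_rows), size=(sketch_rows, n))
    SA, Sb = S @ A, S @ b
    return np.linalg.solve(SA.T @ SA + lam * np.eye(d), SA.T @ Sb)
```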
arXiv Detail & Related papers (2020-06-10T15:00:09Z)
- A Riemannian Primal-dual Algorithm Based on Proximal Operator and its Application in Metric Learning [3.511851311025242]
We propose a primal-dual algorithm to optimize the primal and dual variables iteratively.
We prove convergence of the proposed algorithm and show its non-asymptotic convergence rate.
Preliminary experimental results on an optimal fund selection problem in fund of funds management showed its efficacy.
arXiv Detail & Related papers (2020-05-19T03:31:01Z)
- Optimal Randomized First-Order Methods for Least-Squares Problems [56.05635751529922]
This class of algorithms encompasses several randomized methods among the fastest solvers for least-squares problems.
We focus on two classical embeddings, namely, Gaussian projections and subsampled Hadamard transforms.
Our resulting algorithm yields the best complexity known for solving least-squares problems with no condition number dependence.
arXiv Detail & Related papers (2020-02-21T17:45:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.