Convex Quaternion Optimization for Signal Processing: Theory and
Applications
- URL: http://arxiv.org/abs/2305.06879v1
- Date: Tue, 9 May 2023 16:11:17 GMT
- Title: Convex Quaternion Optimization for Signal Processing: Theory and
Applications
- Authors: Shuning Sun, Qiankun Diao, Dongpo Xu, Pauline Bourigault and Danilo P.
Mandic
- Abstract summary: We establish an essential theory of convex quaternion optimization for signal processing based on the generalized Hamilton-real calculus.
We present five discriminant theorems for convex quaternion functions, and four discriminant criteria for strongly convex quaternion functions.
- Score: 18.6716071499445
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Convex optimization methods have been extensively used in the fields of
communications and signal processing. However, the theory of quaternion
optimization is currently not as fully developed and systematic as that of
complex and real optimization. To this end, we establish an essential theory of
convex quaternion optimization for signal processing based on the generalized
Hamilton-real (GHR) calculus. This is achieved in a way which conforms with
traditional complex and real optimization theory. To ensure rigour, we present
five discriminant theorems for convex quaternion functions and four
discriminant criteria for strongly convex quaternion functions. Furthermore, we provide a
fundamental theorem for the optimality of convex quaternion optimization
problems, and demonstrate its utility through three applications in quaternion
signal processing. These results provide a solid theoretical foundation for
convex quaternion optimization and open avenues for further developments in
signal processing applications.
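The GHR-based theory itself is not reproduced here, but the flavour of a convex quaternion problem can be illustrated with quaternion least squares, which maps to an equivalent convex real problem through the real 4x4 representation of quaternion multiplication. This is a minimal sketch under that standard reduction; the function names (`qmat`, `quaternion_lstsq`) are illustrative and not from the paper.

```python
import numpy as np

def qmat(q):
    # Real 4x4 left-multiplication matrix of quaternion q = (w, x, y, z):
    # qmat(q) @ p equals the Hamilton product q * p in vector form.
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def quaternion_lstsq(a_list, b_list):
    # Solve min_x sum_i |a_i * x - b_i|^2 over quaternions x by stacking
    # the real representations -- a convex real least-squares problem.
    A = np.vstack([qmat(a) for a in a_list])
    b = np.concatenate(b_list)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

For consistent data (b_i generated exactly as a_i * x), the recovered x matches the true quaternion to machine precision.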
Related papers
- Global Optimization of Gaussian Process Acquisition Functions Using a Piecewise-Linear Kernel Approximation [2.3342885570554652]
We introduce a piecewise-linear approximation for Gaussian process kernels and a corresponding MIQP representation for acquisition functions.
We empirically demonstrate the framework on synthetic functions, constrained benchmarks, and hyperparameter tuning tasks.
arXiv Detail & Related papers (2024-10-22T10:56:52Z) - Distributed Markov Chain Monte Carlo Sampling based on the Alternating
Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
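The sampler above builds on the ADMM splitting; the sampling machinery is beyond this summary, but the underlying consensus-ADMM structure can be sketched for distributed least squares, where each worker holds a data shard (A_i, b_i), updates a local copy, and the copies are averaged into a consensus variable. This is a generic illustration of the splitting, not the paper's sampling algorithm.

```python
import numpy as np

def consensus_admm_lstsq(As, bs, rho=1.0, iters=300):
    # Distributed least squares: min_x sum_i ||A_i x - b_i||^2 via
    # consensus ADMM. Each worker i solves a local regularized problem,
    # then the local solutions are averaged into the consensus z.
    n = As[0].shape[1]
    z = np.zeros(n)
    us = [np.zeros(n) for _ in As]   # scaled dual variables
    xs = [np.zeros(n) for _ in As]   # local primal copies
    for _ in range(iters):
        for i, (A, b) in enumerate(zip(As, bs)):
            xs[i] = np.linalg.solve(A.T @ A + rho * np.eye(n),
                                    A.T @ b + rho * (z - us[i]))
        z = np.mean([x + u for x, u in zip(xs, us)], axis=0)
        for i in range(len(us)):
            us[i] = us[i] + xs[i] - z
    return z
```

On a strongly convex quadratic like this, the consensus iterate converges linearly to the centralized least-squares solution.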
arXiv Detail & Related papers (2024-01-29T02:08:40Z) - The HR-Calculus: Enabling Information Processing with Quaternion Algebra [23.004932995116054]
Quaternions and their division algebra have proven advantageous in modelling rotation/orientation in three-dimensional spaces.
Adaptive information processing techniques specifically designed for quaternion-valued signals have only recently come to the attention of the machine learning, signal processing, and control communities.
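The rotation modelling mentioned above is the textbook use of quaternions: a 3-vector v is rotated by a unit quaternion q via v' = q v q*. A minimal sketch with the Hamilton product (function names illustrative):

```python
import numpy as np

def qmul(q, p):
    # Hamilton product of quaternions in (w, x, y, z) order
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = p
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def rotate(v, axis, angle):
    # Rotate 3-vector v about a unit axis by `angle` radians: v' = q v q*
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    q = np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])
    qc = q * np.array([1.0, -1.0, -1.0, -1.0])   # quaternion conjugate
    return qmul(qmul(q, np.concatenate([[0.0], v])), qc)[1:]
```

Rotating the x-axis by 90 degrees about z yields the y-axis, as expected.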
arXiv Detail & Related papers (2023-11-28T13:25:34Z) - GloptiNets: Scalable Non-Convex Optimization with Certificates [61.50835040805378]
We present a novel approach to non-convex optimization with certificates, which handles smooth functions on the hypercube or on the torus.
By exploiting the regularity of the target function, intrinsic in the decay of its spectrum, we are able both to obtain precise certificates and to leverage advanced and powerful neural networks.
arXiv Detail & Related papers (2023-06-26T09:42:59Z) - Machine Learning Discovery of Optimal Quadrature Rules for Isogeometric
Analysis [0.5161531917413708]
We propose the use of machine learning techniques to find optimal quadrature rules in isogeometric analysis.
We find optimal quadrature rules for spline spaces when using IGA discretizations with up to 50 uniform elements and degrees up to 8.
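The learned rules above target spline spaces in IGA; as a point of reference, the classical notion of an "optimal" rule is Gauss-Legendre quadrature, where n points integrate polynomials up to degree 2n-1 exactly. A minimal sketch (not the paper's learned rules):

```python
import numpy as np

def gauss_legendre_integrate(f, a, b, n):
    # n-point Gauss-Legendre rule: exact for polynomials of degree <= 2n-1
    nodes, weights = np.polynomial.legendre.leggauss(n)
    # Map nodes from the reference interval [-1, 1] to [a, b]
    x = 0.5 * (b - a) * nodes + 0.5 * (a + b)
    return 0.5 * (b - a) * np.dot(weights, f(x))
```

For example, three points suffice to integrate x^5 on [0, 1] exactly (degree 5 <= 2*3 - 1).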
arXiv Detail & Related papers (2023-04-04T13:59:07Z) - Extrinsic Bayesian Optimizations on Manifolds [1.3477333339913569]
We propose an extrinsic Bayesian optimization (eBO) framework for general optimization problems on manifolds.
Our approach is to employ extrinsic Gaussian processes by first embedding the manifold into some higher-dimensional Euclidean space.
This leads to efficient and scalable algorithms for optimization over complex manifolds.
arXiv Detail & Related papers (2022-12-21T06:10:12Z) - Non-Convex Optimization with Certificates and Fast Rates Through Kernel
Sums of Squares [68.8204255655161]
We consider potentially non-convex optimization problems.
In this paper, we propose an algorithm that achieves close to optimal a priori computational guarantees.
arXiv Detail & Related papers (2022-04-11T09:37:04Z) - Trilevel and Multilevel Optimization using Monotone Operator Theory [5.927983571004003]
We consider a trilevel optimization problem, where the objective of the two lower layers consists of a sum of a smooth and a non-smooth term.
We present a natural first-order algorithm and analyze its convergence and rates of convergence in several regimes of parameters.
arXiv Detail & Related papers (2021-05-19T21:31:18Z) - Zeroth-Order Hybrid Gradient Descent: Towards A Principled Black-Box
Optimization Framework [100.36569795440889]
This work studies zeroth-order (ZO) optimization, which does not require first-order gradient information.
We show that with a graceful design in coordinate importance sampling, the proposed ZO optimization method is efficient both in terms of complexity as well as function query cost.
arXiv Detail & Related papers (2020-12-21T17:29:58Z) - Recent Theoretical Advances in Non-Convex Optimization [56.88981258425256]
Motivated by recent increased interest in the analysis of optimization algorithms for non-convex optimization in deep networks and other data problems, we give an overview of recent theoretical results on optimization algorithms for non-convex optimization.
arXiv Detail & Related papers (2020-12-11T08:28:51Z) - A Primer on Zeroth-Order Optimization in Signal Processing and Machine
Learning [95.85269649177336]
ZO optimization iteratively performs three major steps: gradient estimation, descent direction, and solution update.
We demonstrate promising applications of ZO optimization, such as evaluating and generating explanations from black-box deep learning models, and efficient online sensor management.
arXiv Detail & Related papers (2020-06-11T06:50:35Z)
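The three ZO steps named above (gradient estimation, descent direction, solution update) can be sketched with the standard two-point random gradient estimator; the parameters (`mu`, `n_samples`, `lr`) are illustrative defaults, not values from either paper.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, n_samples=20, rng=None):
    # Step 1, gradient estimation: two-point random estimator
    # g ~= (d/m) * sum_j (f(x + mu*u_j) - f(x)) / mu * u_j
    rng = rng or np.random.default_rng(0)
    d = x.size
    g = np.zeros(d)
    for _ in range(n_samples):
        u = rng.normal(size=d)
        u /= np.linalg.norm(u)                    # uniform on the sphere
        g += d * (f(x + mu * u) - f(x)) / mu * u
    return g / n_samples

def zo_descent(f, x0, lr=0.1, steps=200):
    # Steps 2-3: descent direction is the negative ZO gradient estimate,
    # and the solution update is a plain gradient step.
    x = np.asarray(x0, float)
    rng = np.random.default_rng(1)
    for _ in range(steps):
        x = x - lr * zo_gradient(f, x, rng=rng)
    return x
```

On a smooth quadratic, the iterates approach the minimizer using only function evaluations, never explicit gradients.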
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.