Learning Set Functions that are Sparse in Non-Orthogonal Fourier Bases
- URL: http://arxiv.org/abs/2010.00439v3
- Date: Mon, 29 Mar 2021 12:26:21 GMT
- Title: Learning Set Functions that are Sparse in Non-Orthogonal Fourier Bases
- Authors: Chris Wendler, Andisheh Amrollahi, Bastian Seifert, Andreas Krause, Markus Püschel
- Abstract summary: We present a new family of algorithms for learning Fourier-sparse set functions.
In contrast to other work that focused on the Walsh-Hadamard transform, our novel algorithms operate with recently introduced non-orthogonal Fourier transforms.
We demonstrate effectiveness on several real-world applications.
- Score: 73.53227696624306
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many applications of machine learning on discrete domains, such as learning
preference functions in recommender systems or auctions, can be reduced to
estimating a set function that is sparse in the Fourier domain. In this work,
we present a new family of algorithms for learning Fourier-sparse set
functions. They require at most $nk - k \log_2 k + k$ queries (set function
evaluations), under mild conditions on the Fourier coefficients, where $n$ is
the size of the ground set and $k$ the number of non-zero Fourier coefficients.
In contrast to other work that focused on the orthogonal Walsh-Hadamard
transform, our novel algorithms operate with recently introduced non-orthogonal
Fourier transforms that offer different notions of Fourier-sparsity. These
naturally arise when modeling, e.g., sets of items forming substitutes and
complements. We demonstrate effectiveness on several real-world applications.
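The abstract contrasts the non-orthogonal transforms used in the paper with the orthogonal Walsh-Hadamard transform (WHT). As a minimal, purely illustrative sketch of what "Fourier-sparse" means for a set function, the following computes the WHT of a hypothetical set function f(S) = |S| on a ground set of n = 3 items (the function and its indexing-by-bitmask are assumptions for illustration, not the paper's algorithm):

```python
import numpy as np

def walsh_hadamard(values):
    """Fast Walsh-Hadamard transform, normalized so coefficients are averages."""
    coeffs = np.array(values, dtype=float)
    n = coeffs.size
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = coeffs[j], coeffs[j + h]
                coeffs[j], coeffs[j + h] = a + b, a - b
        h *= 2
    return coeffs / n

# Hypothetical set function on n = 3 items: f(S) = |S|, indexed by subset bitmask.
# A modular (linear) set function like this is highly Fourier-sparse.
n = 3
f = [bin(s).count("1") for s in range(2 ** n)]
coeffs = walsh_hadamard(f)
sparsity = int(np.sum(np.abs(coeffs) > 1e-9))
print(sparsity)  # → 4, i.e. only n + 1 of the 2^n coefficients are non-zero
```

A sparse spectrum like this (k = 4 non-zero coefficients out of 2^3) is exactly the regime where query-efficient learning with roughly nk queries becomes possible; the paper's algorithms exploit the analogous sparsity under non-orthogonal transforms.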
Related papers
- Neural Fourier Modelling: A Highly Compact Approach to Time-Series Analysis [9.969451740838418]
We introduce Neural Fourier Modelling (NFM), a compact yet powerful solution for time-series analysis.
NFM is grounded in two key properties of the Fourier transform (FT): (i) the ability to model finite-length time series as functions in the Fourier domain, and (ii) the capacity for data manipulation within the Fourier domain.
NFM achieves state-of-the-art performance on a wide range of tasks, including challenging time-series scenarios with previously unseen sampling rates at test time.
arXiv Detail & Related papers (2024-10-07T02:39:55Z)
- Trigonometric Quadrature Fourier Features for Scalable Gaussian Process Regression [3.577968559443225]
Quadrature Fourier Features (QFF) have gained popularity in recent years due to their improved approximation accuracy and better calibrated uncertainty estimates.
A key limitation of QFF is that its performance can suffer from well-known pathologies related to highly oscillatory quadrature.
We address this critical issue via a new Trigonometric Quadrature Fourier Feature (TQFF) method, which uses a novel non-Gaussian quadrature rule.
arXiv Detail & Related papers (2023-10-23T03:53:09Z)
- Group Equivariant Fourier Neural Operators for Partial Differential Equations [33.71890280061319]
We consider solving partial differential equations with Fourier neural operators (FNOs).
In this work, we extend group convolutions to the frequency domain and design Fourier layers that are equivariant to rotations, translations, and reflections.
The resulting $G$-FNO architecture generalizes well across input resolutions and performs well in settings with varying levels of symmetry.
arXiv Detail & Related papers (2023-06-09T06:34:16Z)
- Fourier expansion in variational quantum algorithms [1.4732811715354455]
We focus on the class of variational circuits, where constant gates are Clifford gates and parameterized gates are generated by Pauli operators.
We give a classical algorithm that computes the coefficients of all trigonometric monomials up to degree $m$ in time bounded by $\mathcal{O}(N2^m)$.
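The claim that such expectation values are trigonometric polynomials in the circuit parameters can be checked directly in a tiny case. The following sketch (a standard single-qubit example, not the paper's algorithm) verifies that the expectation of Z after an RY(theta) rotation of |0> is the degree-1 trigonometric monomial cos(theta):

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix (real-valued)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

Z = np.diag([1.0, -1.0])
psi0 = np.array([1.0, 0.0])  # |0>

# The expectation <psi|Z|psi> with psi = RY(theta)|0> equals cos(theta),
# i.e. a trigonometric polynomial of degree 1 in the gate parameter.
for theta in np.linspace(0.0, 2 * np.pi, 9):
    psi = ry(theta) @ psi0
    expval = psi @ Z @ psi  # amplitudes are real, so no conjugation needed
    assert abs(expval - np.cos(theta)) < 1e-12
```

For deeper circuits with several Pauli-generated gates, the same reasoning yields multivariate trigonometric polynomials, which is what makes a coefficient-extraction algorithm with a degree cutoff meaningful.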
arXiv Detail & Related papers (2023-04-07T18:00:01Z)
- Fourier Continuation for Exact Derivative Computation in Physics-Informed Neural Operators [53.087564562565774]
PINO is a machine learning architecture that has shown promising empirical results for learning partial differential equations.
We present an architecture that leverages Fourier continuation (FC) to apply the exact gradient method to PINO for nonperiodic problems.
arXiv Detail & Related papers (2022-11-29T06:37:54Z)
- Deep Fourier Up-Sampling [100.59885545206744]
Unlike up-sampling in the spatial domain, up-sampling in the Fourier domain is more challenging because it does not follow a local property.
We propose a theoretically sound Deep Fourier Up-Sampling (FourierUp) to solve these issues.
arXiv Detail & Related papers (2022-10-11T06:17:31Z)
- Unified Fourier-based Kernel and Nonlinearity Design for Equivariant Networks on Homogeneous Spaces [52.424621227687894]
We introduce a unified framework for group equivariant networks on homogeneous spaces.
We take advantage of the sparsity of Fourier coefficients of the lifted feature fields.
We show that other methods treating features as the Fourier coefficients in the stabilizer subgroup are special cases of our activation.
arXiv Detail & Related papers (2022-06-16T17:59:01Z)
- Fourier Disentangled Space-Time Attention for Aerial Video Recognition [54.80846279175762]
We present an algorithm, Fourier Activity Recognition (FAR), for UAV video activity recognition.
Our formulation uses a novel Fourier object disentanglement method to innately separate out the human agent from the background.
We have evaluated our approach on multiple UAV datasets including UAV Human RGB, UAV Human Night, Drone Action, and NEC Drone.
arXiv Detail & Related papers (2022-03-21T01:24:53Z)
- Factorized Fourier Neural Operators [77.47313102926017]
The Factorized Fourier Neural Operator (F-FNO) is a learning-based method for simulating partial differential equations.
We show that our model maintains an error rate of 2% while still running an order of magnitude faster than a numerical solver.
arXiv Detail & Related papers (2021-11-27T03:34:13Z)
- Fourier Neural Networks as Function Approximators and Differential Equation Solvers [0.456877715768796]
The choice of activation and loss function yields results that replicate a Fourier series expansion closely.
We validate this FNN on naturally periodic smooth functions and on piecewise continuous periodic functions.
The main advantages of the current approach are the validity of the solution outside the training region, interpretability of the trained model, and simplicity of use.
arXiv Detail & Related papers (2020-05-27T00:30:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences of its use.