FAN: Fourier Analysis Networks
- URL: http://arxiv.org/abs/2410.02675v2
- Date: Sat, 09 Nov 2024 19:07:44 GMT
- Title: FAN: Fourier Analysis Networks
- Authors: Yihong Dong, Ge Li, Yongding Tao, Xue Jiang, Kechi Zhang, Jia Li, Jing Su, Jun Zhang, Jingjing Xu
- Abstract summary: We propose FAN, which equips neural networks with the ability to efficiently model and reason about periodic phenomena.
We demonstrate the effectiveness of FAN in modeling and reasoning about periodic functions, and the superiority and generalizability of FAN across a range of real-world tasks.
- Score: 47.08787684221114
- Abstract: Despite the remarkable success achieved by neural networks, particularly those represented by MLP and Transformer, we reveal that they exhibit potential flaws in the modeling and reasoning of periodicity, i.e., they tend to memorize periodic data rather than genuinely understand the underlying principles of periodicity. However, periodicity is a crucial trait in various forms of reasoning and generalization, underpinning predictability across natural and engineered systems through recurring patterns in observations. In this paper, we propose FAN, a novel network architecture based on Fourier Analysis that can efficiently model and reason about periodic phenomena. By introducing the Fourier Series, periodicity is naturally integrated into the structure and computational processes of the neural network, achieving a more accurate expression and prediction of periodic patterns. As a promising substitute for the multi-layer perceptron (MLP), FAN can seamlessly replace MLP in various models with fewer parameters and FLOPs. Through extensive experiments, we demonstrate the effectiveness of FAN in modeling and reasoning about periodic functions, and the superiority and generalizability of FAN across a range of real-world tasks, including symbolic formula representation, time series forecasting, and language modeling.
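As a minimal sketch of the idea described in the abstract (not the paper's exact formulation), a FAN-style layer can be written as cos/sin Fourier features of a learned projection concatenated with a conventional activated affine branch; the dimensions and the tanh activation below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical dimensions: input 8, periodic branch 4, non-periodic branch 6
W_p = rng.normal(size=(8, 4))
W_g = rng.normal(size=(8, 6))
b_g = np.zeros(6)

def fan_layer(x):
    """One FAN-style layer: Fourier features of a linear projection
    (periodic part) concatenated with a standard activated affine map."""
    p = x @ W_p                 # projection fed to cos/sin
    g = np.tanh(x @ W_g + b_g)  # conventional non-periodic branch
    return np.concatenate([np.cos(p), np.sin(p), g], axis=-1)

x = rng.normal(size=(5, 8))
y = fan_layer(x)
print(y.shape)  # (5, 14): 4 cos + 4 sin + 6 activation features
```

Because the cos/sin branch is periodic by construction, such a layer can extrapolate periodic structure beyond the training range rather than merely memorizing it.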
Related papers
- Frequency-domain MLPs are More Effective Learners in Time Series Forecasting [67.60443290781988]
Time series forecasting has played the key role in different industrial domains, including finance, traffic, energy, and healthcare.
Most MLP-based forecasting methods suffer from point-wise mappings and an information bottleneck.
We propose FreTS, a simple yet effective architecture built upon frequency-domain MLPs for time series forecasting.
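A minimal sketch of a frequency-domain layer in this spirit (the pointwise complex weights and the toy series length are assumptions for illustration, not FreTS's actual parameterization):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 16            # illustrative series length
F = T // 2 + 1    # number of rfft frequency bins
# learnable complex weights and bias acting per frequency
W = rng.normal(size=F) + 1j * rng.normal(size=F)
b = rng.normal(size=F) + 1j * rng.normal(size=F)

def freq_layer(x):
    """Map the series to the frequency domain, apply a complex
    pointwise linear layer, and transform back."""
    X = np.fft.rfft(x, axis=-1)           # to frequency domain
    H = X * W + b                         # complex map per frequency bin
    return np.fft.irfft(H, n=T, axis=-1)  # back to the time domain

x = np.sin(np.linspace(0, 2 * np.pi, T, endpoint=False))
y = freq_layer(x)
print(y.shape)  # (16,)
```

Operating on frequency components gives each weight a global (whole-series) view, which is the motivation for moving the MLP out of the point-wise time domain.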
arXiv Detail & Related papers (2023-11-10T17:05:13Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- How Expressive are Spectral-Temporal Graph Neural Networks for Time Series Forecasting? [46.95117922583253]
We establish a theoretical framework that unravels the expressive power of spectral-temporal GNNs.
Our results show that linear spectral-temporal GNNs are universal under mild assumptions.
We propose a simple instantiation named Temporal Graph GegenConv (TGC) which significantly outperforms most existing models.
arXiv Detail & Related papers (2023-05-11T05:56:38Z)
- A Survey on Deep Learning based Time Series Analysis with Frequency Transformation [74.3919960186696]
Frequency transformation (FT) has been increasingly incorporated into deep learning models to enhance state-of-the-art accuracy and efficiency in time series analysis.
Despite the growing attention and the proliferation of research in this emerging field, there is currently a lack of a systematic review and in-depth analysis of deep learning-based time series models with FT.
We present a comprehensive review that systematically investigates and summarizes the recent research advancements in deep learning-based time series analysis with FT.
arXiv Detail & Related papers (2023-02-04T14:33:07Z)
- Dynamics of Fourier Modes in Torus Generative Adversarial Networks [0.8189696720657245]
Generative Adversarial Networks (GANs) are powerful Machine Learning models capable of generating fully synthetic samples of a desired phenomenon with a high resolution.
Despite their success, the training process of a GAN is highly unstable, and it is typically necessary to apply several auxiliary perturbations to the networks to reach acceptable convergence.
We introduce a novel method to analyze the convergence and stability in the training of Generative Adversarial Networks.
arXiv Detail & Related papers (2022-09-05T09:03:22Z)
- Bayesian Inference of Stochastic Dynamical Networks [0.0]
This paper presents a novel method for learning network topology and internal dynamics.
Our method achieves state-of-the-art performance compared with group sparse Bayesian learning (GSBL), BINGO, kernel-based methods, dynGENIE3, GENIE3, and ARNI.
arXiv Detail & Related papers (2022-06-02T03:22:34Z)
- UNIPoint: Universally Approximating Point Processes Intensities [125.08205865536577]
We provide a proof that a class of learnable functions can universally approximate any valid intensity function.
We implement UNIPoint, a novel neural point process model, using recurrent neural networks to parameterise sums of basis functions at each event.
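A toy sketch of the intensity parameterization this describes: a positive-valued sum of basis functions of the elapsed time since the last event. The exponential basis, the softplus link, and the fixed coefficients below are illustrative assumptions; in the actual model the coefficients would be produced by an RNN at each event.

```python
import numpy as np

def softplus(z):
    """Smooth positive link function."""
    return np.log1p(np.exp(z))

def intensity(t, last_event, alpha, beta):
    """Conditional intensity as a softplus of a sum of exponential
    basis functions of the time since the last event."""
    s = np.sum(alpha * np.exp(-beta * (t - last_event)))
    return softplus(s)

alpha = np.array([0.5, -0.2, 0.1])  # basis weights (toy values)
beta = np.array([1.0, 2.0, 0.5])    # decay rates (toy values)
lam = intensity(t=1.3, last_event=1.0, alpha=alpha, beta=beta)
print(lam > 0)  # True: softplus keeps the intensity valid
```

The softplus guarantees a valid (strictly positive) intensity regardless of the sign of the basis coefficients.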
arXiv Detail & Related papers (2020-07-28T09:31:56Z)
- Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data.
arXiv Detail & Related papers (2020-06-05T07:06:19Z)
- Fourier Neural Networks as Function Approximators and Differential Equation Solvers [0.456877715768796]
The choice of activation and loss function yields results that replicate a Fourier series expansion closely.
We validate this FNN on naturally periodic smooth functions and on piecewise continuous periodic functions.
The main advantages of the current approach are the validity of the solution outside the training region, interpretability of the trained model, and simplicity of use.
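The correspondence between a sine-activated network and a Fourier series can be made concrete: a single hidden layer with sine activations computes exactly a truncated Fourier expansion once the frequencies are harmonics. The frequencies, phases, and amplitudes below are hand-picked for illustration, not learned:

```python
import numpy as np

def fourier_nn(x, w, phi, a):
    """Single hidden layer with sine activation:
    f(x) = sum_k a_k * sin(w_k * x + phi_k),
    i.e. a truncated Fourier series when w_k are harmonics."""
    return np.sin(np.outer(x, w) + phi) @ a

w = np.array([1.0, 2.0, 3.0])    # harmonic frequencies
phi = np.zeros(3)                # phases
a = np.array([1.0, 0.5, 0.25])   # amplitudes
x = np.linspace(0, 2 * np.pi, 50)
target = np.sin(x) + 0.5 * np.sin(2 * x) + 0.25 * np.sin(3 * x)
assert np.allclose(fourier_nn(x, w, phi, a), target)
```

With integer frequencies the network output is exactly 2π-periodic, which is why such a model remains valid outside the training region.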
arXiv Detail & Related papers (2020-05-27T00:30:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.