Nonlinear Principal Component Analysis with Random Bernoulli Features for Process Monitoring
- URL: http://arxiv.org/abs/2503.12456v1
- Date: Sun, 16 Mar 2025 11:20:27 GMT
- Title: Nonlinear Principal Component Analysis with Random Bernoulli Features for Process Monitoring
- Authors: Ke Chen, Dandan Jiang
- Abstract summary: We propose a novel random Bernoulli principal component analysis method to efficiently capture nonlinear patterns in the process. We design four fast process monitoring methods based on random Bernoulli principal component analysis to extend its nonlinear capabilities.
- Score: 4.910232130159309
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Industrial processes generate substantial amounts of data with highly complex structures, leading to the development of numerous nonlinear statistical methods. However, most of these methods rely on computations involving large-scale dense kernel matrices. This dependence poses significant challenges in meeting the high computational demands and real-time responsiveness required by online monitoring systems. To alleviate the computational burden of dense large-scale matrix multiplication, we incorporate the bootstrap sampling concept into random feature mapping and propose a novel random Bernoulli principal component analysis method to efficiently capture nonlinear patterns in the process. We derive a convergence bound for the kernel matrix approximation constructed using random Bernoulli features, ensuring theoretical robustness. Subsequently, we design four fast process monitoring methods based on random Bernoulli principal component analysis to extend its nonlinear capabilities for handling diverse fault scenarios. Finally, numerical experiments and real-world data analyses are conducted to evaluate the performance of the proposed methods. Results demonstrate that the proposed methods offer excellent scalability and reduced computational complexity, achieving substantial cost savings with minimal performance loss compared to traditional kernel-based approaches.
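The abstract describes the overall recipe at a high level: replace computations on a dense n-by-n kernel matrix with an explicit random feature map, perform PCA in that low-dimensional feature space, and monitor new samples with PCA-based statistics. The sketch below (Python/NumPy) illustrates only this general pattern; it uses a standard random Fourier feature map as a stand-in for the paper's random Bernoulli/bootstrap feature construction, and the usual Hotelling $T^2$ and SPE statistics as stand-ins for the four proposed monitoring methods, so all names, parameters, and sizes here are illustrative assumptions rather than the authors' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random projection shared by training and monitoring. A random Fourier
# feature map (approximating a Gaussian kernel) is used here purely as a
# stand-in for the paper's random Bernoulli features.
d, m, gamma = 10, 200, 0.5
W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, m))
b = rng.uniform(0.0, 2.0 * np.pi, size=m)

def feature_map(X):
    """Explicit nonlinear features whose inner products approximate a kernel,
    so no dense n-by-n kernel matrix is ever formed."""
    return np.sqrt(2.0 / m) * np.cos(np.atleast_2d(X) @ W + b)

# Offline phase: PCA on the features of normal operating data.
X_train = rng.normal(size=(500, d))          # placeholder for normal process data
Z = feature_map(X_train)
mu = Z.mean(axis=0)
Zc = Z - mu
_, S, Vt = np.linalg.svd(Zc, full_matrices=False)
var = S**2 / (len(Zc) - 1)                   # variance captured by each component
k = int(np.searchsorted(np.cumsum(var) / var.sum(), 0.90)) + 1   # keep ~90% variance
P, lam = Vt[:k].T, var[:k]                   # loadings and retained eigenvalues

def monitoring_stats(x_new):
    """Hotelling T^2 and squared prediction error (SPE) for a new sample."""
    z = feature_map(x_new) - mu
    t = z @ P                                # scores in the retained subspace
    T2 = float(np.sum(t**2 / lam))
    spe = float(np.sum((z - t @ P.T) ** 2))  # residual energy outside the subspace
    return T2, spe

# A sample would be flagged when T^2 or SPE exceeds control limits fitted offline.
print(monitoring_stats(rng.normal(size=d)))
```

The point of the feature-space route is that the decomposition is applied to an $n \times m$ feature matrix (with $m \ll n$ random features) instead of an $n \times n$ kernel matrix, which is the source of the scalability claimed in the abstract.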
Related papers
- Fast Shapley Value Estimation: A Unified Approach [71.92014859992263]
We propose a straightforward and efficient Shapley estimator, SimSHAP, by eliminating redundant techniques.
In our analysis of existing approaches, we observe that estimators can be unified as a linear transformation of randomly summed values from feature subsets.
Our experiments validate the effectiveness of our SimSHAP, which significantly accelerates the computation of accurate Shapley values.
arXiv Detail & Related papers (2023-11-02T06:09:24Z) - Distributionally Robust Model-based Reinforcement Learning with Large State Spaces [55.14361269378122]
Three major challenges in reinforcement learning are the complex dynamical systems with large state spaces, the costly data acquisition processes, and the deviation of real-world dynamics from the training environment at deployment time.
We study distributionally robust Markov decision processes with continuous state spaces under the widely used Kullback-Leibler, chi-square, and total variation uncertainty sets.
We propose a model-based approach that utilizes Gaussian Processes and the maximum variance reduction algorithm to efficiently learn multi-output nominal transition dynamics.
arXiv Detail & Related papers (2023-09-05T13:42:11Z) - A neural network based model for multi-dimensional nonlinear Hawkes processes [0.0]
We introduce the Neural Network for Nonlinear Hawkes processes (NNNH), a non-parametric method based on neural networks to fit nonlinear Hawkes processes.
Our results highlight the effectiveness of the NNNH method in accurately capturing the complexities of nonlinear Hawkes processes.
arXiv Detail & Related papers (2023-03-06T12:31:19Z) - Spectral learning of Bernoulli linear dynamical systems models [21.3534487101893]
We develop a learning method for fast, efficient fitting of latent linear dynamical system models.
Our approach extends traditional subspace identification methods to the Bernoulli setting.
We demonstrate the estimator in real-world settings by analyzing data from mice performing a sensory decision-making task.
arXiv Detail & Related papers (2023-03-03T16:29:12Z) - Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z) - A Dynamical System View of Langevin-Based Non-Convex Sampling [44.002384711340966]
Non-convex sampling is a key challenge in machine learning, central to non-convex optimization in deep learning as well as to approximate probabilistic inference.
Existing guarantees typically only hold for the averaged iterates rather than the more desirable last iterates.
We develop a new framework that lifts the above issues by harnessing several tools from the theory of dynamical systems.
arXiv Detail & Related papers (2022-10-25T09:43:36Z) - FaDIn: Fast Discretized Inference for Hawkes Processes with General
Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point processes inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach leads to improved estimation of pattern latency compared to the state-of-the-art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z) - A Forward Backward Greedy approach for Sparse Multiscale Learning [0.0]
We propose a feature driven Reproducing Kernel Hilbert space (RKHS) for which the associated kernel has a weighted multiscale structure.
For generating approximations in this space, we provide a practical forward-backward algorithm that is shown to greedily construct a set of basis functions having a multiscale structure.
We analyze the performance of the approach on a variety of simulation and real data sets.
arXiv Detail & Related papers (2021-02-14T04:22:52Z) - Sparse PCA via $l_{2,p}$-Norm Regularization for Unsupervised Feature Selection [138.97647716793333]
We propose a simple and efficient unsupervised feature selection method, by combining reconstruction error with $l_{2,p}$-norm regularization (an illustrative form of this objective is sketched after this list).
We present an efficient optimization algorithm to solve the proposed unsupervised model, and analyse the convergence and computational complexity of the algorithm theoretically.
arXiv Detail & Related papers (2020-12-29T04:08:38Z) - Progressive Batching for Efficient Non-linear Least Squares [31.082253632197023]
Most improvements of the basic Gauss-Newton algorithm tackle convergence guarantees or leverage the sparsity of the underlying problem structure for computational speedup.
Our work borrows ideas from both machine learning and statistics, and we present an approach for non-linear least-squares that guarantees convergence while significantly reducing the required amount of computation.
arXiv Detail & Related papers (2020-10-21T13:00:04Z) - Effective Proximal Methods for Non-convex Non-smooth Regularized Learning [27.775096437736973]
We show that the independent sampling scheme tends to improve performance over the commonly-used uniform sampling scheme.
Our new analysis also derives a faster convergence speed than the best one available so far.
arXiv Detail & Related papers (2020-09-14T16:41:32Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of the stochasticity in its success is still unclear.
We show that multiplicative noise, which commonly arises due to variance in local rates of convergence, induces heavy-tailed behaviour in the parameters.
A detailed analysis is conducted describing how key factors, including step size and data, influence this behaviour, with consistent results on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
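For the sparse PCA feature-selection entry above, the $l_{2,p}$-norm is the row-wise norm $\|W\|_{2,p} = \big(\sum_{i=1}^{d} \|w_i\|_2^{p}\big)^{1/p}$, which drives entire rows of $W$ to zero and thereby discards the corresponding features. A generic objective of the kind that summary describes, combining reconstruction error with this penalty, is $\min_{W} \|X - X W W^{\top}\|_F^2 + \lambda \|W\|_{2,p}$; this is only an illustrative form under assumed notation ($X \in \mathbb{R}^{n \times d}$, $W \in \mathbb{R}^{d \times k}$, $0 < p \le 1$), not necessarily the exact objective of the cited paper.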