Latent Tensor Factorization with Nonlinear PID Control for Missing Data Recovery in Non-Intrusive Load Monitoring
- URL: http://arxiv.org/abs/2504.13483v1
- Date: Fri, 18 Apr 2025 05:48:14 GMT
- Title: Latent Tensor Factorization with Nonlinear PID Control for Missing Data Recovery in Non-Intrusive Load Monitoring
- Authors: Yiran Wang, Tangtang Xie, Hao Wu
- Abstract summary: Non-Intrusive Load Monitoring (NILM) has emerged as a key smart grid technology. This paper proposes a Nonlinear Proportional-Integral-Derivative (PID)-Incorporated Latent factorization of tensors (NPIL) model with two-fold ideas. Experimental results on real-world NILM datasets demonstrate that the proposed NPIL model surpasses state-of-the-art models in convergence rate and accuracy when predicting the missing NILM data.
- Score: 2.94258758663678
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Non-Intrusive Load Monitoring (NILM) has emerged as a key smart grid technology, identifying electrical devices and providing detailed energy consumption data for precise demand response management. Nevertheless, NILM data suffer from missing values due to inescapable factors like sensor failure, leading to inaccuracies in non-intrusive load monitoring. A stochastic gradient descent (SGD)-based latent factorization of tensors (LFT) model has proven effective in estimating missing data; however, it updates a latent factor solely based on the current stochastic gradient, without considering past information, which leads to slow convergence of an LFT model. To address this issue, this paper proposes a Nonlinear Proportional-Integral-Derivative (PID)-Incorporated Latent factorization of tensors (NPIL) model with two-fold ideas: a) rebuilding the instant learning error according to the principle of a nonlinear PID controller, so that past update information is efficiently incorporated into the learning scheme, and b) implementing gain parameter adaptation by utilizing the particle swarm optimization (PSO) algorithm, hence effectively improving the model's computational efficiency. Experimental results on real-world NILM datasets demonstrate that the proposed NPIL model surpasses state-of-the-art models in convergence rate and accuracy when predicting the missing NILM data.
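The core idea of rebuilding the instant learning error with a PID controller can be illustrated with a minimal sketch. This is not the authors' implementation: it uses a plain (linear) PID term with fixed, hand-picked gains on a toy synthetic tensor, whereas the paper uses a nonlinear PID rule and adapts the gains with PSO. All dimensions, gains, and learning rates below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy rank-3 tensor (e.g. appliance x time-slot x day) with ~30% observed
# entries, standing in for an incomplete NILM consumption tensor.
I, J, K, R = 8, 8, 8, 3
Y = np.einsum('ir,jr,kr->ijk',
              rng.random((I, R)), rng.random((J, R)), rng.random((K, R)))
mask = rng.random((I, J, K)) < 0.3
obs = np.argwhere(mask)

# Latent factor matrices to be learned.
A, B, C = (rng.random((d, R)) for d in (I, J, K))

# Illustrative fixed PID gains; the paper adapts them via PSO instead.
Kp, Ki, Kd = 1.0, 0.05, 0.05
lr, lam = 0.05, 0.01                 # learning rate, L2 regularization
integral = np.zeros(len(obs))        # accumulated past error per entry
prev_err = np.zeros(len(obs))

for epoch in range(200):
    for n, (i, j, k) in enumerate(obs):
        err = Y[i, j, k] - A[i] @ (B[j] * C[k])
        integral[n] += err
        # PID-rebuilt instant error: current (P), cumulative past (I),
        # and differential (D) terms replace the raw residual.
        e_pid = Kp * err + Ki * integral[n] + Kd * (err - prev_err[n])
        prev_err[n] = err
        # SGD step driven by the rebuilt error.
        A[i] += lr * (e_pid * (B[j] * C[k]) - lam * A[i])
        B[j] += lr * (e_pid * (A[i] * C[k]) - lam * B[j])
        C[k] += lr * (e_pid * (A[i] * B[j]) - lam * C[k])

rmse = np.sqrt(np.mean(
    (np.einsum('ir,jr,kr->ijk', A, B, C) - Y)[~mask] ** 2))
print(f"RMSE on held-out entries: {rmse:.3f}")
```

Setting Ki = Kd = 0 recovers the standard SGD update the abstract criticizes; the integral and differential terms are what inject past update information into each step.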
Related papers
- Interpretable Deep Regression Models with Interval-Censored Failure Time Data [1.2993568435938014]
Deep learning methods for interval-censored data remain underexplored and limited to specific data types or models. This work proposes a general regression framework for interval-censored data with a broad class of partially linear transformation models. Applying our method to the Alzheimer's Disease Neuroimaging Initiative dataset yields novel insights and improved predictive performance compared to traditional approaches.
arXiv Detail & Related papers (2025-03-25T15:27:32Z)
- Bayesian Model Parameter Learning in Linear Inverse Problems with Application in EEG Focal Source Imaging [49.1574468325115]
Inverse problems can be described as limited-data problems in which the signal of interest cannot be observed directly. We studied a linear inverse problem that included an unknown non-linear model parameter. We utilized a Bayesian model-based learning approach that allowed signal recovery and subsequent estimation of the model parameter.
arXiv Detail & Related papers (2025-01-07T18:14:24Z)
- Low-rank finetuning for LLMs: A fairness perspective [54.13240282850982]
Low-rank approximation techniques have become the de facto standard for fine-tuning Large Language Models.
This paper investigates the effectiveness of these methods in capturing the shift of fine-tuning datasets from the initial pre-trained data distribution.
We show that low-rank fine-tuning inadvertently preserves undesirable biases and toxic behaviors.
arXiv Detail & Related papers (2024-05-28T20:43:53Z)
- A PID-Controlled Non-Negative Tensor Factorization Model for Analyzing Missing Data in NILM [0.0]
Non-Intrusive Load Monitoring (NILM) has become an essential tool in smart grid and energy management.
Traditional imputation methods, such as linear interpolation and matrix factorization, struggle with nonlinear relationships and are sensitive to sparse data.
This paper proposes a Proportional-Integral-Derivative (PID) Non-Negative Latent Factorization of tensor (PNLF) model, which dynamically adjusts parameter gradients to improve convergence, stability, and accuracy.
arXiv Detail & Related papers (2024-03-09T10:01:49Z)
- A PAC-Bayesian Perspective on the Interpolating Information Criterion [54.548058449535155]
We show how a PAC-Bayes bound is obtained for a general class of models, characterizing factors which influence performance in the interpolating regime.
We quantify how the test error for overparameterized models achieving effectively zero training error depends on the quality of the implicit regularization imposed by, e.g., the combination of model and parameter-initialization scheme.
arXiv Detail & Related papers (2023-11-13T01:48:08Z)
- Fast Latent Factor Analysis via a Fuzzy PID-Incorporated Stochastic Gradient Descent Algorithm [1.984879854062214]
A stochastic gradient descent (SGD)-based latent factor analysis model is remarkably effective in extracting valuable information from a high-dimensional and incomplete (HDI) matrix.
A standard SGD algorithm learns a latent factor relying only on the gradient of the current instance error, without considering past update information.
This paper proposes a Fuzzy PID-incorporated SGD algorithm with two-fold ideas: 1) rebuilding the instance error by efficiently incorporating past update information following the principle of PID, and 2) implementing hyper-parameter and gain adaptation following fuzzy rules.
arXiv Detail & Related papers (2023-03-07T14:51:09Z)
- A Nonlinear PID-Enhanced Adaptive Latent Factor Analysis Model [6.2303427193075755]
High-dimensional and incomplete (HDI) data holds tremendous interactive information in various industrial applications.
A latent factor (LF) model is remarkably effective in extracting valuable information from HDI data with a stochastic gradient descent (SGD) algorithm.
An SGD-based LFA model suffers from slow convergence since it only considers the current learning error.
arXiv Detail & Related papers (2022-08-04T07:48:19Z)
- Truncated tensor Schatten p-norm based approach for spatiotemporal traffic data imputation with complicated missing patterns [77.34726150561087]
We introduce four complicated missing patterns, including random missing and three fiber-like missing cases according to the mode-driven fibers.
Despite the nonconvexity of the objective function in our model, we derive the optimal solutions by integrating the alternating direction method of multipliers (ADMM).
arXiv Detail & Related papers (2022-05-19T08:37:56Z)
- PI-NLF: A Proportional-Integral Approach for Non-negative Latent Factor Analysis [9.087387628717952]
A non-negative latent factor (NLF) model performs efficient representation learning on an HDI matrix.
A PI-NLF model outperforms the state-of-the-art models in both computational efficiency and estimation accuracy for missing data of an HDI matrix.
arXiv Detail & Related papers (2022-05-05T12:04:52Z)
- Improving Generalization via Uncertainty Driven Perturbations [107.45752065285821]
We consider uncertainty-driven perturbations of the training data points.
Unlike loss-driven perturbations, uncertainty-guided perturbations do not cross the decision boundary.
We show that UDP is guaranteed to achieve the maximum-margin decision boundary on linear models.
arXiv Detail & Related papers (2022-02-11T16:22:08Z)
- Imputation-Free Learning from Incomplete Observations [73.15386629370111]
We introduce an importance-guided stochastic gradient descent (IGSGD) method to train models to perform inference from inputs containing missing values without imputation.
We employ reinforcement learning (RL) to adjust the gradients used to train the models via back-propagation.
Our imputation-free predictions outperform the traditional two-step imputation-based predictions using state-of-the-art imputation methods.
arXiv Detail & Related papers (2021-07-05T12:44:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.