Unsupervised Graph Spectral Feature Denoising for Crop Yield Prediction
- URL: http://arxiv.org/abs/2208.02714v1
- Date: Thu, 4 Aug 2022 15:18:06 GMT
- Title: Unsupervised Graph Spectral Feature Denoising for Crop Yield Prediction
- Authors: Saghar Bagheri, Chinthaka Dinesh, Gene Cheung, Timothy Eadie
- Abstract summary: Prediction of annual crop yields at a county granularity is important for national food production and price stability.
We denoise relevant features, which are inputs to a deep learning prediction model, via graph spectral filtering.
Using denoised features as input, performance of a crop yield prediction model can be improved noticeably.
- Score: 27.604637365723676
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Prediction of annual crop yields at a county granularity is important for
national food production and price stability. In this paper, towards the goal
of better crop yield prediction, we leverage recent graph signal processing
(GSP) tools to exploit spatial correlation among neighboring counties and
denoise relevant features, which are inputs to a deep learning prediction
model, via graph spectral filtering. Specifically, we first construct a
combinatorial graph with edge weights that encode county-to-county
similarities in soil and location features via metric learning. We then
denoise features via a maximum a posteriori (MAP) formulation with a graph
Laplacian regularizer (GLR). We focus on the challenge of estimating, in an
unsupervised manner, the crucial weight parameter $\mu$, which trades off the
fidelity term against the GLR and is a function of the noise variance. We
first estimate the noise variance directly from noise-corrupted graph signals
using a graph clique detection (GCD) procedure that discovers locally constant
regions. We then compute an optimal $\mu$ that minimizes an approximate mean
square error function via bias-variance analysis. Experimental results on
collected USDA data show that, using denoised features as input, the
performance of a crop yield prediction model improves noticeably.
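The GLR denoising step described above admits a closed-form solution: minimizing $\|y - x\|_2^2 + \mu x^\top L x$ over $x$ gives $x^* = (I + \mu L)^{-1} y$. Below is a minimal, self-contained sketch of that denoiser; it is not the authors' implementation. It assumes a Gaussian-kernel k-nearest-neighbor graph over made-up county features in place of the paper's learned metric, and it simply sweeps a few candidate $\mu$ values rather than performing the unsupervised GCD noise-variance estimation and bias-variance selection of $\mu$ described in the abstract.

```python
# Minimal GLR-denoising sketch (not the authors' code):
#   x* = argmin_x ||y - x||^2 + mu * x^T L x  =>  x* = (I + mu*L)^{-1} y
import numpy as np
from scipy.sparse import csgraph
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)

# Toy "county" features: 2-D location plus one soil attribute (synthetic, for illustration only).
n = 50
feats = np.hstack([rng.uniform(0, 10, (n, 2)), rng.normal(0, 1, (n, 1))])

# Edge weights w_ij = exp(-d_ij^2 / (2 sigma^2)) on a k-nearest-neighbor graph
# (the paper instead learns the feature metric).
d = cdist(feats, feats)
sigma = np.median(d)
W = np.exp(-d ** 2 / (2 * sigma ** 2))
np.fill_diagonal(W, 0.0)
k = 5
for i in range(n):
    W[i, np.argsort(W[i])[:-k]] = 0.0   # keep only the k strongest neighbors of node i
W = np.maximum(W, W.T)                  # symmetrize

L = csgraph.laplacian(W)                # combinatorial Laplacian L = D - W

# Smooth ground-truth signal on the graph plus additive Gaussian noise.
x_true = np.sin(0.5 * feats[:, 0]) + 0.3 * feats[:, 2]
y = x_true + rng.normal(0.0, 0.2, n)

def glr_denoise(y, L, mu):
    """MAP / GLR solution x* = (I + mu*L)^{-1} y."""
    return np.linalg.solve(np.eye(len(y)) + mu * L, y)

# The paper selects mu by minimizing an approximate MSE from a bias-variance
# analysis given an unsupervised noise-variance estimate; here we only show
# the fidelity/smoothness trade-off against a known ground truth.
for mu in (0.1, 1.0, 10.0):
    x_hat = glr_denoise(y, L, mu)
    print(f"mu={mu:5.1f}  MSE vs. ground truth={np.mean((x_hat - x_true) ** 2):.4f}")
```

Since $I + \mu L$ is symmetric positive definite, the dense solve above can be swapped for conjugate gradients or a sparse Cholesky factorization when the county graph is large.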
Related papers
- Less is More: One-shot Subgraph Reasoning on Large-scale Knowledge Graphs [49.547988001231424]
We propose one-shot-subgraph link prediction to achieve efficient and adaptive prediction.
The design principle is that, instead of acting directly on the whole KG, the prediction procedure is decoupled into two steps.
We achieve improved efficiency and leading performance on five large-scale benchmarks.
arXiv Detail & Related papers (2024-03-15T12:00:12Z)
- Graph Classification Gaussian Processes via Spectral Features [7.474662887810221]
Graph classification aims to categorise graphs based on their structure and node attributes.
In this work, we propose to tackle this task using tools from graph signal processing by deriving spectral features.
We show that even such a simple approach, having no learned parameters, can yield competitive performance compared to strong neural network and graph kernel baselines.
arXiv Detail & Related papers (2023-06-06T15:31:05Z)
- Graph Sparsification for GCN Towards Optimal Crop Yield Predictions [27.415307133655407]
We propose a graph sparsification method based on the Fiedler number to remove edges from a complete graph kernel.
We show that our method produces a sparse graph with good GCN performance compared to other graph sparsification schemes in crop yield prediction.
arXiv Detail & Related papers (2023-06-02T17:51:56Z)
- Graph Signal Sampling for Inductive One-Bit Matrix Completion: a Closed-form Solution [112.3443939502313]
We propose a unified graph signal sampling framework which enjoys the benefits of graph signal analysis and processing.
The key idea is to transform each user's ratings on the items to a function (signal) on the vertices of an item-item graph.
For the online setting, we develop a Bayesian extension, i.e., BGS-IMC which considers continuous random Gaussian noise in the graph Fourier domain.
arXiv Detail & Related papers (2023-02-08T08:17:43Z)
- Patch-level Gaze Distribution Prediction for Gaze Following [49.93340533068501]
We introduce the patch distribution prediction (PDP) method for gaze following training.
We show that our model regularizes the MSE loss by predicting better heatmap distributions on images with larger annotation variances.
Experiments show that our model bridges the gap between the target prediction and in/out prediction subtasks, yielding a significant improvement on both subtasks on public gaze following datasets.
arXiv Detail & Related papers (2022-11-20T19:25:15Z)
- Large Graph Signal Denoising with Application to Differential Privacy [2.867517731896504]
We consider the case of signal denoising on graphs via a data-driven wavelet tight frame methodology.
We make it scalable to large graphs using Chebyshev-Jackson approximations.
A comprehensive performance analysis is carried out on graphs of varying size, from real and simulated data.
arXiv Detail & Related papers (2022-09-05T16:32:54Z)
- From Spectral Graph Convolutions to Large Scale Graph Convolutional Networks [0.0]
Graph Convolutional Networks (GCNs) have been shown to be a powerful concept that has been successfully applied to a large variety of tasks.
We study the theory that paved the way to the definition of GCN, including related parts of classical graph theory.
arXiv Detail & Related papers (2022-07-12T16:57:08Z)
- Differentiable Annealed Importance Sampling and the Perils of Gradient Noise [68.44523807580438]
Annealed importance sampling (AIS) and related algorithms are highly effective tools for marginal likelihood estimation.
Differentiability is a desirable property as it would admit the possibility of optimizing marginal likelihood as an objective.
We propose a differentiable algorithm by abandoning Metropolis-Hastings steps, which further unlocks mini-batch computation.
arXiv Detail & Related papers (2021-07-21T17:10:14Z)
- Unrolling of Deep Graph Total Variation for Image Denoising [106.93258903150702]
In this paper, we combine classical graph signal filtering with deep feature learning into a competitive hybrid design.
We employ interpretable analytical low-pass graph filters and use 80% fewer network parameters than the state-of-the-art DL denoising scheme DnCNN.
arXiv Detail & Related papers (2020-10-21T20:04:22Z)
- Neural Enhanced Belief Propagation on Factor Graphs [85.61562052281688]
A graphical model is a structured representation of locally dependent random variables.
We first extend graph neural networks to factor graphs (FG-GNN).
We then propose a new hybrid model that runs conjointly a FG-GNN with belief propagation.
arXiv Detail & Related papers (2020-03-04T11:03:07Z)