Localized convolutional neural networks for geospatial wind forecasting
- URL: http://arxiv.org/abs/2005.05930v3
- Date: Fri, 10 Jul 2020 16:13:17 GMT
- Title: Localized convolutional neural networks for geospatial wind forecasting
- Authors: Arnas Uselis, Mantas Lukoševičius, Lukas Stasytis
- Abstract summary: Convolutional Neural Networks (CNN) possess many positive qualities when it comes to spatial raster data.
In this work, we propose localized convolutional neural networks that enable CNNs to learn local features in addition to the global ones.
They can be added to any convolutional layers, easily end-to-end trained, introduce minimal additional complexity, and let CNNs retain most of their benefits to the extent that they are needed.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Convolutional Neural Networks (CNN) possess many positive qualities when it
comes to spatial raster data. Translation invariance enables CNNs to detect
features regardless of their position in the scene. However, in some domains,
like geospatial, not all locations are exactly equal. In this work, we propose
localized convolutional neural networks that enable convolutional architectures
to learn local features in addition to the global ones. We investigate their
instantiations in the form of learnable inputs, local weights, and a more
general form. They can be added to any convolutional layers, easily end-to-end
trained, introduce minimal additional complexity, and let CNNs retain most of
their benefits to the extent that they are needed. In this work we address
spatio-temporal prediction: test the effectiveness of our methods on a
synthetic benchmark dataset and tackle three real-world wind prediction
datasets. For one of them, we propose a method to spatially order the unordered
data. We compare the recent state-of-the-art spatio-temporal prediction models
on the same data. Models that use convolutional layers can be and are extended
with our localizations. In all these cases our extensions improve the results,
and thus often the state-of-the-art. We share all the code at a public
repository.
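To make the proposed localizations concrete, below is a minimal sketch, in PyTorch, of the two simplest forms named in the abstract: learnable per-location inputs concatenated to the feature map, and per-location ("local") weights applied to the output. This is not the authors' released implementation; the class name `LocalizedConv2d`, the parameter shapes, the zero initialization, and the fixed-grid assumption are all illustrative choices.
```python
# A hedged sketch of localized convolutions, assuming a fixed H x W grid.
# Not the paper's reference code; names and initializations are illustrative.
import torch
import torch.nn as nn


class LocalizedConv2d(nn.Module):
    """Conv2d extended with two localizations from the abstract:
    (a) learnable inputs: extra per-location channels concatenated to
        every sample, so the shared kernels can condition on position;
    (b) local weights: here simplified to a learnable per-location,
        per-channel bias added to the convolution output.
    Height and width must be fixed up front, because the localization
    parameters are tied to absolute grid positions.
    """

    def __init__(self, in_channels, out_channels, kernel_size,
                 height, width, num_local_channels=2):
        super().__init__()
        # (a) Learnable per-location input channels, shared across the batch.
        self.local_inputs = nn.Parameter(
            torch.zeros(1, num_local_channels, height, width))
        self.conv = nn.Conv2d(in_channels + num_local_channels,
                              out_channels, kernel_size, padding='same')
        # (b) Learnable per-location output bias ("local weights").
        self.local_bias = nn.Parameter(
            torch.zeros(1, out_channels, height, width))

    def forward(self, x):
        # Broadcast the learnable inputs over the batch and concatenate.
        loc = self.local_inputs.expand(x.size(0), -1, -1, -1)
        out = self.conv(torch.cat([x, loc], dim=1))
        return out + self.local_bias  # broadcasts over the batch dimension


# Usage: a drop-in replacement for nn.Conv2d on a fixed 32x32 grid.
layer = LocalizedConv2d(in_channels=4, out_channels=8, kernel_size=3,
                        height=32, width=32)
y = layer(torch.randn(16, 4, 32, 32))  # shape: (16, 8, 32, 32)
```
Because both parameter maps start at zero, this layer initially behaves like a plain convolution and learns location-specific deviations only where the data calls for them, which mirrors the abstract's claim that CNNs retain most of their benefits to the extent that they are needed.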
Related papers
- Set-based Neural Network Encoding Without Weight Tying [91.37161634310819]
We propose a neural network weight encoding method for network property prediction.
Our approach is capable of encoding neural networks from a model zoo of mixed architectures.
We introduce two new tasks for neural network property prediction: cross-dataset and cross-architecture.
arXiv Detail & Related papers (2023-05-26T04:34:28Z)
- Predicting COVID-19 pandemic by spatio-temporal graph neural networks: A New Zealand's study [16.3773496061049]
We propose a novel deep learning architecture named Attention-based Multiresolution Graph Neural Networks (ATMGNN).
Our method can capture the multiscale structures of the spatial graph via a learning-to-cluster algorithm in a data-driven manner.
As future work, we plan to extend our method to real-time prediction and a global scale.
arXiv Detail & Related papers (2023-05-12T19:00:17Z)
- Improved Convergence Guarantees for Shallow Neural Networks [91.3755431537592]
We prove convergence of depth 2 neural networks, trained via gradient descent, to a global minimum.
Our model has the following features: regression with quadratic loss function, fully connected feedforward architecture, ReLU activations, Gaussian data instances, and adversarial labels.
These results strongly suggest that, at least in our model, the convergence phenomenon extends well beyond the NTK regime.
arXiv Detail & Related papers (2022-12-05T14:47:52Z)
- Universal Approximation Property of Fully Convolutional Neural Networks with Zero Padding [10.295288663157393]
CNNs function as tensor-to-tensor mappings, preserving the spatial structure of input data.
We show that CNNs can approximate arbitrary continuous functions in cases where both the input and output values exhibit the same spatial shape.
We also verify that deep, narrow CNNs possess the universal approximation property (UAP) as tensor-to-tensor functions.
arXiv Detail & Related papers (2022-11-18T02:04:16Z)
- What Can Be Learnt With Wide Convolutional Neural Networks? [69.55323565255631]
We study infinitely-wide deep CNNs in the kernel regime.
We prove that deep CNNs adapt to the spatial scale of the target function.
We conclude by computing the generalisation error of a deep CNN trained on the output of another deep CNN.
arXiv Detail & Related papers (2022-08-01T17:19:32Z)
- Focal Sparse Convolutional Networks for 3D Object Detection [121.45950754511021]
We introduce two new modules to enhance the capability of Sparse CNNs.
They are focal sparse convolution (Focals Conv) and its multi-modal variant, focal sparse convolution with fusion.
For the first time, we show that spatially learnable sparsity in sparse convolution is essential for sophisticated 3D object detection.
arXiv Detail & Related papers (2022-04-26T17:34:10Z)
- Local Augmentation for Graph Neural Networks [78.48812244668017]
We introduce local augmentation, which enhances node features using their local subgraph structures.
Based on local augmentation, we further design a novel framework, LA-GNN, which can be applied to any GNN model in a plug-and-play manner.
arXiv Detail & Related papers (2021-09-08T18:10:08Z)
- Deep Parametric Continuous Convolutional Neural Networks [92.87547731907176]
Parametric Continuous Convolution is a new learnable operator that operates over non-grid structured data.
Our experiments show significant improvement over the state-of-the-art in point cloud segmentation of indoor and outdoor scenes.
arXiv Detail & Related papers (2021-01-17T18:28:23Z)
- Wireless Localisation in WiFi using Novel Deep Architectures [4.541069830146568]
This paper studies the indoor localisation of WiFi devices based on a commodity chipset and standard channel sounding.
We present a novel shallow neural network (SNN) in which features are extracted from the channel state information corresponding to WiFi subcarriers received on different antennas.
arXiv Detail & Related papers (2020-10-16T22:48:29Z)
- PushNet: Efficient and Adaptive Neural Message Passing [1.9121961872220468]
Message passing neural networks have recently evolved into a state-of-the-art approach to representation learning on graphs.
Existing methods perform synchronous message passing along all edges in multiple subsequent rounds.
We consider a novel asynchronous message passing approach where information is pushed only along the most relevant edges until convergence.
arXiv Detail & Related papers (2020-03-04T18:15:30Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.