Domain-informed neural networks for interaction localization within
astroparticle experiments
- URL: http://arxiv.org/abs/2112.07995v1
- Date: Wed, 15 Dec 2021 09:42:04 GMT
- Title: Domain-informed neural networks for interaction localization within
astroparticle experiments
- Authors: Shixiao Liang, Aaron Higuera, Christina Peters, Venkat Roy, Waheed U.
Bajwa, Hagit Shatkay, Christopher D. Tunnell
- Abstract summary: This work proposes a domain-informed neural network architecture for experimental particle physics.
It uses particle interaction localization with the time-projection chamber (TPC) technology for dark matter research as an example application.
- Score: 6.157382820537719
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: This work proposes a domain-informed neural network architecture for
experimental particle physics, using particle interaction localization with the
time-projection chamber (TPC) technology for dark matter research as an example
application. A key feature of the signals generated within the TPC is that they
allow localization of particle interactions through a process called
reconstruction. While multilayer perceptrons (MLPs) have emerged as a leading
contender for reconstruction in TPCs, such a black-box approach does not
reflect prior knowledge of the underlying scientific processes. This paper
looks anew at neural network-based interaction localization and encodes prior
detector knowledge, in terms of both signal characteristics and detector
geometry, into the feature encoding and the output layers of a multilayer
neural network. The resulting Domain-informed Neural Network (DiNN) limits the
receptive fields of the neurons in the initial feature encoding layers in order
to account for the spatially localized nature of the signals produced within
the TPC. This aspect of the DiNN, which has similarities with the emerging area
of graph neural networks in that the neurons in the initial layers only connect
to a handful of neurons in their succeeding layer, significantly reduces the
number of parameters in the network in comparison to an MLP. In addition, in
order to account for the detector geometry, the output layers of the network
are modified using two geometric transformations to ensure the DiNN produces
localizations within the interior of the detector. The end result is a neural
network architecture that has 60% fewer parameters than an MLP, but that still
achieves similar localization performance and provides a path to future
architectural developments with improved performance because of its ability
to encode additional domain knowledge into the architecture.
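To make the two architectural ideas above concrete, here is a minimal sketch (not the authors' code) of how restricted receptive fields and a geometric output constraint could be combined in PyTorch. The names DomainInformedNet and neighbor_index, the default radius, and the single tanh squashing transformation at the output are illustrative assumptions; the paper's DiNN applies two geometric transformations whose exact form is not given in the abstract.

```python
# Illustrative DiNN-style sketch (assumptions, not the authors' implementation):
# a locally connected feature-encoding layer whose neurons see only a few
# neighbouring sensors, followed by a small MLP whose 2-D output is squashed
# into a disk so every prediction lies inside the detector's circular cross-section.
import torch
import torch.nn as nn


class DomainInformedNet(nn.Module):  # hypothetical name
    def __init__(self, neighbor_index: torch.Tensor, hidden: int = 64, radius: float = 1.0):
        super().__init__()
        # neighbor_index: (n_sensors, k) long tensor giving, for each sensor,
        # the k nearby sensors it may connect to (its local receptive field).
        self.register_buffer("neighbor_index", neighbor_index)
        n_sensors, k = neighbor_index.shape
        # One small weight vector per sensor instead of a dense
        # n_sensors x n_sensors matrix -- the source of the parameter savings.
        self.local_weights = nn.Parameter(0.1 * torch.randn(n_sensors, k))
        self.head = nn.Sequential(
            nn.Linear(n_sensors, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),  # unconstrained 2-D output (u, v)
        )
        self.radius = radius  # detector radius in the units of the training targets

    def forward(self, hits: torch.Tensor) -> torch.Tensor:
        # hits: (batch, n_sensors) per-sensor signal amplitudes.
        gathered = hits[:, self.neighbor_index]           # (batch, n_sensors, k)
        local = (gathered * self.local_weights).sum(-1)   # restricted receptive fields
        uv = self.head(torch.relu(local))                 # (batch, 2)
        # Geometric output transformation (illustrative): rescale the norm with
        # tanh so the prediction always falls strictly inside a disk of `radius`.
        norm = uv.norm(dim=-1, keepdim=True).clamp_min(1e-9)
        return uv / norm * self.radius * torch.tanh(norm)


# Example usage with a toy 4-sensor geometry where each sensor connects to
# itself and one neighbour (purely illustrative).
neighbors = torch.tensor([[0, 1], [1, 2], [2, 3], [3, 0]])
model = DomainInformedNet(neighbors)
xy = model(torch.rand(8, 4))  # 8 events -> (8, 2) positions inside the disk
```

In this sketch, neighbor_index would be built from the detector geometry (for each photosensor, the indices of its nearest neighbours); keeping the first layer sparse in this way is what accounts for the parameter reduction relative to a fully connected MLP.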
Related papers
- Topological Representations of Heterogeneous Learning Dynamics of Recurrent Spiking Neural Networks [16.60622265961373]
Spiking Neural Networks (SNNs) have become an essential paradigm in neuroscience and artificial intelligence.
Recent work in the literature has studied the network representations of deep neural networks.
arXiv Detail & Related papers (2024-03-19T05:37:26Z)
- DYNAP-SE2: a scalable multi-core dynamic neuromorphic asynchronous spiking neural network processor [2.9175555050594975]
We present a brain-inspired platform for prototyping real-time event-based Spiking Neural Networks (SNNs).
The proposed system supports the direct emulation of dynamic and realistic neural processing phenomena such as short-term plasticity, NMDA gating, AMPA diffusion, homeostasis, spike frequency adaptation, conductance-based dendritic compartments and spike transmission delays.
The flexibility to emulate different biologically plausible neural networks, and the chip's ability to monitor both population and single neuron signals in real time, allow the development and validation of complex models of neural processing for both basic research and edge-computing applications.
arXiv Detail & Related papers (2023-10-01T03:48:16Z)
- Parameter Convex Neural Networks [13.42851919291587]
We propose the exponential multilayer neural network (EMLP) which is convex with regard to the parameters of the neural network under some conditions.
For later experiments, we use the same architecture to build the exponential graph convolutional network (EGCN) and run the experiment on a graph classification dataset.
arXiv Detail & Related papers (2022-06-11T16:44:59Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z)
- BioLCNet: Reward-modulated Locally Connected Spiking Neural Networks [0.6193838300896449]
We propose a spiking neural network (SNN) trained using spike-timing-dependent plasticity (STDP) and its reward-modulated variant (R-STDP) learning rules.
Our network consists of a rate-coded input layer followed by a locally connected hidden layer and a decoding output layer.
We used the MNIST dataset to obtain image classification accuracy and to assess the robustness of our rewarding system to varying target responses.
arXiv Detail & Related papers (2021-09-12T15:28:48Z)
- An error-propagation spiking neural network compatible with neuromorphic processors [2.432141667343098]
We present a spike-based learning method that approximates back-propagation using local weight update mechanisms.
We introduce a network architecture that enables synaptic weight update mechanisms to back-propagate error signals.
This work represents a first step towards the design of ultra-low power mixed-signal neuromorphic processing systems.
arXiv Detail & Related papers (2021-04-12T07:21:08Z)
- Topological obstructions in neural networks learning [67.8848058842671]
We study global properties of the loss gradient function flow.
We use topological data analysis of the loss function and its Morse complex to relate local behavior along gradient trajectories with global properties of the loss surface.
arXiv Detail & Related papers (2020-12-31T18:53:25Z)
- Multi-Tones' Phase Coding (MTPC) of Interaural Time Difference by Spiking Neural Network [68.43026108936029]
We propose a pure spiking neural network (SNN) based computational model for precise sound localization in the noisy real-world environment.
We implement this algorithm in a real-time robotic system with a microphone array.
The experimental results show a mean azimuth error of 13 degrees, which surpasses the accuracy of the other biologically plausible neuromorphic approach for sound source localization.
arXiv Detail & Related papers (2020-07-07T08:22:56Z)
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.