nbi: the Astronomer's Package for Neural Posterior Estimation
- URL: http://arxiv.org/abs/2312.03824v2
- Date: Thu, 21 Dec 2023 20:37:57 GMT
- Title: nbi: the Astronomer's Package for Neural Posterior Estimation
- Authors: Keming Zhang, Joshua S. Bloom, Stéfan van der Walt, Nina Hernitschek
- Abstract summary: We introduce a new framework and open-source software nbi (Neural Bayesian Inference), which supports both amortized and sequential NPE.
First, nbi provides built-in "featurizer" networks with demonstrated efficacy on sequential data, such as light curves and spectra, thus obviating the need for this customization on the user end.
Second, we introduce a modified algorithm, SNPE-IS, which facilitates exact posterior inference by using the posterior under NPE only as a proposal distribution for importance sampling.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Despite the promise of Neural Posterior Estimation (NPE) methods in
astronomy, the adaptation of NPE into the routine inference workflow has been
slow. We identify three critical issues: the need for custom featurizer
networks tailored to the observed data, the inference inexactness, and the
under-specification of physical forward models. To address the first two
issues, we introduce a new framework and open-source software nbi (Neural
Bayesian Inference), which supports both amortized and sequential NPE. First,
nbi provides built-in "featurizer" networks with demonstrated efficacy on
sequential data, such as light curves and spectra, thus obviating the need for
this customization on the user end. Second, we introduce a modified algorithm
SNPE-IS, which facilitates asymptotically exact inference by using the surrogate
posterior under NPE only as a proposal distribution for importance sampling.
These features allow nbi to be applied off-the-shelf to astronomical inference
problems involving light curves and spectra. We discuss how nbi may serve as an
effective alternative to existing methods such as Nested Sampling. Our package
is at https://github.com/kmzzhang/nbi.
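The core of the SNPE-IS idea above, treating the surrogate posterior under NPE only as a proposal distribution for importance sampling, can be sketched in a few lines. This is a generic self-normalized importance-sampling example with a hypothetical Gaussian surrogate, prior, and likelihood standing in for the trained network and the physical forward model; none of the function names below come from the nbi API.

```python
import numpy as np

# Hypothetical stand-ins: a surrogate posterior q(theta | x_obs) that can
# sample and evaluate its own density, plus a Gaussian prior and likelihood.
rng = np.random.default_rng(0)

def sample_surrogate(n):          # draw theta ~ q(theta | x_obs)
    return rng.normal(loc=1.0, scale=0.8, size=n)

def log_q(theta):                 # log q(theta | x_obs), here N(1.0, 0.8^2)
    return -0.5 * ((theta - 1.0) / 0.8) ** 2 - np.log(0.8 * np.sqrt(2 * np.pi))

def log_prior(theta):             # standard normal prior N(0, 1)
    return -0.5 * theta ** 2 - 0.5 * np.log(2 * np.pi)

def log_likelihood(theta):        # x_obs = 1.2 observed under N(theta, 0.5^2)
    return -0.5 * ((1.2 - theta) / 0.5) ** 2 - np.log(0.5 * np.sqrt(2 * np.pi))

theta = sample_surrogate(100_000)
log_w = log_prior(theta) + log_likelihood(theta) - log_q(theta)
w = np.exp(log_w - log_w.max())
w /= w.sum()                      # self-normalized importance weights

posterior_mean = np.sum(w * theta)   # consistent estimate of E[theta | x_obs]
ess = 1.0 / np.sum(w ** 2)           # effective sample size diagnostic
print(posterior_mean, ess)
```

As long as the surrogate covers the true posterior, the self-normalized estimate converges to the exact posterior expectation regardless of how imperfect the surrogate is; the effective sample size diagnoses how well the proposal matches the target.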
Related papers
- Stellar Spectra Fitting with Amortized Neural Posterior Estimation and nbi [0.0]
We train an ANPE model for the APOGEE survey and demonstrate its efficacy on both mock and real stellar spectra.
We introduce an effective approach to handling the measurement noise properties inherent in spectral data.
We discuss the utility of an ANPE "model zoo," where models are trained for specific instruments and distributed under the nbi framework.
arXiv Detail & Related papers (2023-12-09T21:30:07Z)
- Accelerating Scalable Graph Neural Network Inference with Node-Adaptive Propagation [80.227864832092]
Graph neural networks (GNNs) have exhibited exceptional efficacy in a diverse array of applications.
The sheer size of large-scale graphs presents a significant challenge to real-time inference with GNNs.
We propose an online propagation framework and two novel node-adaptive propagation methods.
arXiv Detail & Related papers (2023-10-17T05:03:00Z)
- Sparse Function-space Representation of Neural Networks [23.4128813752424]
Deep neural networks (NNs) are known to lack uncertainty estimates and struggle to incorporate new data.
We present a method that mitigates these issues by converting NNs from weight space to function space, via a dual parameterization.
arXiv Detail & Related papers (2023-09-05T12:56:35Z)
- Benign Overfitting in Deep Neural Networks under Lazy Training [72.28294823115502]
We show that when the data distribution is well-separated, DNNs can achieve Bayes-optimal test error for classification.
Our results indicate that interpolating with smoother functions leads to better generalization.
arXiv Detail & Related papers (2023-05-30T19:37:44Z)
- Efficient Bayesian inference using physics-informed invertible neural networks for inverse problems [6.97393424359704]
We introduce an approach for addressing Bayesian inverse problems using physics-informed invertible neural networks (PI-INN).
The PI-INN offers a precise and efficient generative model for Bayesian inverse problems, yielding tractable posterior density estimates.
As a particular physics-informed deep learning model, the primary training challenge for PI-INN centers on enforcing the independence constraint.
arXiv Detail & Related papers (2023-04-25T03:17:54Z)
- Efficient Graph Neural Network Inference at Large Scale [54.89457550773165]
Graph neural networks (GNNs) have demonstrated excellent performance in a wide range of applications.
Existing scalable GNNs leverage linear propagation to preprocess the features and accelerate the training and inference procedure.
We propose a novel adaptive propagation order approach that generates the personalized propagation order for each node based on its topological information.
arXiv Detail & Related papers (2022-11-01T14:38:18Z)
- Sample-Then-Optimize Batch Neural Thompson Sampling [50.800944138278474]
We introduce two algorithms for black-box optimization based on the Thompson sampling (TS) policy.
To choose an input query, we need only train an NN and then maximize the trained NN's output.
Our algorithms sidestep the need to invert the large parameter matrix yet still preserve the validity of the TS policy.
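The sample-then-optimize recipe summarized above, training a model on the observations and then querying where the trained model is maximized, can be sketched as follows. This toy uses a random-feature linear model as a stand-in for the neural network and an assumed quadratic objective; it illustrates the pattern, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy black-box objective, maximized at x = 0.3.
def f(x):
    return -(x - 0.3) ** 2

# Queries observed so far, with a little observation noise.
X = rng.uniform(0, 1, 20)
y = f(X) + 0.01 * rng.normal(size=20)

# One Thompson-sampling step: a fresh random initialization plus a
# least-squares fit yields one "sample" of the objective; the next
# query maximizes that trained model over a candidate grid.
def thompson_step():
    W = rng.normal(size=(50, 1))                # random first-layer weights
    b = rng.uniform(0, 2 * np.pi, 50)           # random phases
    Phi = np.cos(X[:, None] * W.T + b)          # random features of inputs
    theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # "train" the model

    grid = np.linspace(0, 1, 201)               # candidate queries
    Phi_grid = np.cos(grid[:, None] * W.T + b)
    return grid[np.argmax(Phi_grid @ theta)]    # maximize the trained model

x_next = thompson_step()
print(x_next)
```

The randomness of the initialization is what injects the exploration: each fresh draw of features gives a different fitted surrogate, so the maximizer varies from step to step rather than greedily repeating one point.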
arXiv Detail & Related papers (2022-10-13T09:01:58Z)
- Random Features for the Neural Tangent Kernel [57.132634274795066]
We propose an efficient feature map construction of the Neural Tangent Kernel (NTK) of fully-connected ReLU network.
We show that the dimension of the resulting features is much smaller than that of other baseline feature-map constructions achieving comparable error bounds, both in theory and in practice.
arXiv Detail & Related papers (2021-04-03T09:08:12Z)
- Deep Networks for Direction-of-Arrival Estimation in Low SNR [89.45026632977456]
We introduce a Convolutional Neural Network (CNN) that is trained from multi-channel data of the true array manifold matrix.
We train a CNN in the low-SNR regime to predict DoAs across all SNRs.
Our robust solution can be applied in several fields, ranging from wireless array sensors to acoustic microphones or sonars.
arXiv Detail & Related papers (2020-11-17T12:52:18Z)
- A Neural Network Approach for Online Nonlinear Neyman-Pearson Classification [3.6144103736375857]
We propose a novel Neyman-Pearson (NP) classifier that is, for the first time in the literature, both online and nonlinear.
The proposed classifier operates on a binary labeled data stream in an online manner, and maximizes detection power subject to a user-specified and controllable false positive rate.
Our algorithm is appropriate for large-scale data applications and provides decent false positive rate control with real-time processing.
arXiv Detail & Related papers (2020-06-14T20:00:25Z)
- Revisiting Saliency Metrics: Farthest-Neighbor Area Under Curve [23.334584322129142]
Saliency detection has been widely studied because it plays an important role in various vision applications.
It is difficult to evaluate saliency systems because each measure has its own bias.
We propose a new saliency metric based on the AUC property, which aims to sample a more directional negative set for evaluation.
arXiv Detail & Related papers (2020-02-24T20:55:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.