A Sieve Quasi-likelihood Ratio Test for Neural Networks with
Applications to Genetic Association Studies
- URL: http://arxiv.org/abs/2212.08255v1
- Date: Fri, 16 Dec 2022 02:54:46 GMT
- Title: A Sieve Quasi-likelihood Ratio Test for Neural Networks with
Applications to Genetic Association Studies
- Authors: Xiaoxi Shen, Chang Jiang, Lyudmila Sakhanenko and Qing Lu
- Abstract summary: We propose a sieve quasi-likelihood ratio test based on a NN with one hidden layer for testing complex associations.
The validity of the asymptotic distribution is investigated via simulations.
We demonstrate the use of the proposed test by performing a genetic association analysis of sequencing data from the Alzheimer's Disease Neuroimaging Initiative (ADNI).
- Score: 0.32771631221674324
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural networks (NNs) play a central role in modern artificial intelligence
(AI) technology and have been successfully used in areas such as natural
language processing and image recognition. While the majority of NN applications
focus on prediction and classification, there is increasing interest in
statistical inference for neural networks. The study of NN statistical
inference can enhance our understanding of NN statistical properties.
Moreover, it can facilitate NN-based hypothesis testing that can be applied
to hypothesis-driven clinical and biomedical research. In this paper, we
propose a sieve quasi-likelihood ratio test based on a NN with one hidden layer
for testing complex associations. The test statistic has an asymptotic chi-squared
distribution, and therefore it is computationally efficient and easy to
implement in real data analysis. The validity of the asymptotic
distribution is investigated via simulations. Finally, we demonstrate the use
of the proposed test by performing a genetic association analysis of
sequencing data from the Alzheimer's Disease Neuroimaging Initiative (ADNI).
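The chi-squared calibration described in the abstract can be illustrated with a minimal numerical sketch: fit a one-hidden-layer network under the null (tested covariates excluded) and under the alternative (all covariates included), then form a Gaussian quasi-likelihood ratio statistic T = n·log(RSS0/RSS1). This is a toy under stated assumptions, not the authors' procedure: the simulated data, network width, sigmoid activation, learning rate, and plain gradient descent are all illustrative choices, and the degrees of freedom for the chi-squared reference would come from the paper's sieve theory (omitted here).

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_one_layer_nn(X, y, hidden=8, lr=0.1, epochs=3000):
    """Fit a one-hidden-layer sigmoid network by full-batch gradient
    descent on mean squared error; return the residual sum of squares."""
    n, p = X.shape
    W1 = rng.normal(scale=0.5, size=(p, hidden))
    b1 = np.zeros(hidden)
    w2 = rng.normal(scale=0.5, size=hidden)
    b2 = 0.0
    for _ in range(epochs):
        H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))  # hidden activations
        r = H @ w2 + b2 - y                        # residuals
        # backpropagation for the mean squared-error loss
        g2 = H.T @ r / n
        gb2 = r.mean()
        dH = np.outer(r, w2) * H * (1.0 - H)
        g1 = X.T @ dH / n
        gb1 = dH.mean(axis=0)
        w2 -= lr * g2; b2 -= lr * gb2
        W1 -= lr * g1; b1 -= lr * gb1
    H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    return np.sum((H @ w2 + b2 - y) ** 2)

# Simulated data: the outcome depends non-linearly on the first two
# covariates only; the last two are noise.
n = 400
X = rng.normal(size=(n, 4))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.5, size=n)

rss_null = fit_one_layer_nn(X[:, 2:], y)  # H0: tested covariates excluded
rss_full = fit_one_layer_nn(X, y)         # H1: all covariates enter the NN
T = n * (np.log(rss_null) - np.log(rss_full))  # quasi-LRT statistic
print(f"T = {T:.1f}")  # compare with a chi-squared critical value
```

Under the paper's asymptotics the statistic would be referred to a chi-squared distribution; here one would compare T against a critical value for the appropriate degrees of freedom, which this sketch does not derive.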
Related papers
- Statistical tuning of artificial neural network [0.0]
This study introduces methods to enhance the understanding of neural networks, focusing specifically on models with a single hidden layer.
We propose statistical tests to assess the significance of input neurons and introduce algorithms for dimensionality reduction.
This research advances the field of Explainable Artificial Intelligence by presenting robust statistical frameworks for interpreting neural networks.
arXiv Detail & Related papers (2024-09-24T19:47:03Z)
- Training Guarantees of Neural Network Classification Two-Sample Tests by Kernel Analysis [58.435336033383145]
We construct and analyze a neural network two-sample test to determine whether two datasets came from the same distribution.
We derive the theoretical minimum training time needed to ensure the NTK two-sample test detects a given level of deviation between the datasets.
We show that the statistical power associated with the neural network two-sample test goes to 1 as the neural network training samples and test evaluation samples go to infinity.
arXiv Detail & Related papers (2024-07-05T18:41:16Z)
- Provably Neural Active Learning Succeeds via Prioritizing Perplexing Samples [53.95282502030541]
Neural Network-based active learning (NAL) is a cost-effective data selection technique that utilizes neural networks to select and train on a small subset of samples.
We take a step forward by offering a unified explanation, from a feature-learning view, for the success of both types of query-criteria-based NAL.
arXiv Detail & Related papers (2024-06-06T10:38:01Z)
- An Association Test Based on Kernel-Based Neural Networks for Complex Genetic Association Analysis [0.8221435109014762]
We develop a kernel-based neural network model (KNN) that synergizes the strengths of linear mixed models with conventional neural networks.
A MINQUE-based test assesses the joint association of genetic variants with the phenotype.
Two additional tests evaluate and interpret linear and non-linear/non-additive genetic effects.
arXiv Detail & Related papers (2023-12-06T05:02:28Z)
- A Kernel-Based Neural Network Test for High-dimensional Sequencing Data Analysis [0.8221435109014762]
We introduce a new kernel-based neural network (KNN) test for complex association analysis of sequencing data.
Based on KNN, a Wald-type test is then introduced to evaluate the joint association of high-dimensional genetic data with a disease phenotype of interest.
arXiv Detail & Related papers (2023-12-05T16:06:23Z)
- SIT: A Bionic and Non-Linear Neuron for Spiking Neural Network [12.237928453571636]
Spiking Neural Networks (SNNs) have piqued researchers' interest because of their capacity to process temporal information and low power consumption.
Current state-of-the-art methods are limited in biological plausibility and performance because their neurons are generally built on the simple Leaky-Integrate-and-Fire (LIF) model.
Due to the high level of dynamic complexity, modern neuron models have seldom been implemented in SNN practice.
arXiv Detail & Related papers (2022-03-30T07:50:44Z)
- FF-NSL: Feed-Forward Neural-Symbolic Learner [70.978007919101]
This paper introduces a neural-symbolic learning framework, called Feed-Forward Neural-Symbolic Learner (FF-NSL)
FF-NSL integrates state-of-the-art ILP systems based on the Answer Set semantics with neural networks in order to learn interpretable hypotheses from labelled unstructured data.
arXiv Detail & Related papers (2021-06-24T15:38:34Z)
- Statistical model-based evaluation of neural networks [74.10854783437351]
We develop an experimental setup for the evaluation of neural networks (NNs).
The setup helps to benchmark a set of NNs vis-a-vis minimum-mean-square-error (MMSE) performance bounds.
This allows us to test the effects of training data size, data dimension, data geometry, noise, and mismatch between training and testing conditions.
arXiv Detail & Related papers (2020-11-18T00:33:24Z)
- Towards Interaction Detection Using Topological Analysis on Neural Networks [55.74562391439507]
In neural networks, any interacting features must follow a strongly weighted connection to common hidden units.
We propose a new measure for quantifying interaction strength, based upon the well-received theory of persistent homology.
A Persistence Interaction Detection (PID) algorithm is developed to efficiently detect interactions.
arXiv Detail & Related papers (2020-10-25T02:15:24Z)
- Quantifying Statistical Significance of Neural Network-based Image Segmentation by Selective Inference [23.97765106673937]
We use a conditional selective inference (SI) framework to compute exact (non-asymptotic) valid p-values for the segmentation results.
Our proposed method can successfully control the false positive rate, has good performance in terms of computational efficiency, and provides good results when applied to medical image data.
arXiv Detail & Related papers (2020-10-05T07:16:40Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.