Autism Disease Detection Using Transfer Learning Techniques: Performance
Comparison Between Central Processing Unit vs Graphics Processing Unit
Functions for Neural Networks
- URL: http://arxiv.org/abs/2306.00283v1
- Date: Thu, 1 Jun 2023 01:59:17 GMT
- Title: Autism Disease Detection Using Transfer Learning Techniques: Performance
Comparison Between Central Processing Unit vs Graphics Processing Unit
Functions for Neural Networks
- Authors: Mst Shapna Akter, Hossain Shahriar, Alfredo Cuzzocrea
- Abstract summary: We implement a system for classifying Autism disease using face images of autistic and non-autistic children to compare performance.
It was observed that GPU outperformed CPU in all tests conducted.
- Score: 2.750124853532831
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural network approaches are machine learning methods that are widely used
in various domains, such as healthcare and cybersecurity. Neural networks are
especially renowned for their ability to deal with image datasets. During the
training process with images, various fundamental mathematical operations are
performed in the neural network. These operations include several algebraic and
mathematical functions, such as derivatives, convolutions, and matrix
inversions and transpositions. Such operations demand higher processing power
than regular computer usage typically requires. Because CPUs are designed for
serial processing, they are ill-suited to handling large image datasets. GPUs,
by contrast, have parallel processing capabilities and can
provide higher speed. This paper utilizes advanced neural network techniques,
such as VGG16, ResNet50, DenseNet, InceptionV3, Xception, MobileNet,
XGBoost-VGG16, and our proposed models, to compare CPU and GPU resources. We
implemented a system for classifying Autism disease using face images of
autistic and non-autistic children to compare performance during testing. We
used evaluation metrics such as Accuracy, F1 score, Precision, Recall, and
Execution time. It was observed that GPU outperformed CPU in all tests
conducted. Moreover, the performance of the neural network models in terms of
accuracy increased on GPU compared to CPU.
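The abstract names Accuracy, F1 score, Precision, and Recall as its evaluation
metrics. As a minimal sketch of how these are computed for the paper's binary
setting (autistic vs. non-autistic), not the authors' actual code, and with toy
labels invented for illustration:

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, Precision, Recall, and F1 for binary labels
    (1 = positive class, 0 = negative class)."""
    # Count the four confusion-matrix cells.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

if __name__ == "__main__":
    # Toy example: 8 predictions with one false negative and one
    # false positive.
    y_true = [1, 1, 1, 0, 0, 0, 1, 0]
    y_pred = [1, 0, 1, 0, 1, 0, 1, 0]
    print(binary_metrics(y_true, y_pred))
    # → all four metrics equal 0.75 on this toy example
```

Execution time, the remaining metric, is typically measured by wrapping the
training or inference call with a wall-clock timer such as
`time.perf_counter()` on each device.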
Related papers
- Machine learning based biomedical image processing for echocardiographic
images [0.0]
The proposed method uses the K-Nearest Neighbor (KNN) algorithm for segmentation of medical images.
The trained neural network has been tested successfully on a group of echocardiographic images.
arXiv Detail & Related papers (2023-03-16T06:23:43Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Benchmarking GPU and TPU Performance with Graph Neural Networks [0.0]
This work analyzes and compares the GPU and TPU performance training a Graph Neural Network (GNN) developed to solve a real-life pattern recognition problem.
Characterizing the new class of models acting on sparse data may prove helpful in optimizing the design of deep learning libraries and future AI accelerators.
arXiv Detail & Related papers (2022-10-21T21:03:40Z)
- A Neural Network Based Method with Transfer Learning for Genetic Data
Analysis [3.8599966694228667]
We combine the transfer learning technique with a neural network based method (expectile neural networks).
We leverage previous learnings and avoid starting from scratch to improve the model performance.
With transfer learning, expectile neural networks perform better than the same networks trained without it.
arXiv Detail & Related papers (2022-06-20T16:16:05Z)
- Content-Aware Convolutional Neural Networks [98.97634685964819]
Convolutional Neural Networks (CNNs) have achieved great success due to the powerful feature learning ability of convolution layers.
We propose a Content-aware Convolution (CAC) that automatically detects the smooth windows and applies a 1x1 convolutional kernel to replace the original large kernel.
arXiv Detail & Related papers (2021-06-30T03:54:35Z)
- HistoTransfer: Understanding Transfer Learning for Histopathology [9.231495418218813]
We compare the performance of features extracted from networks trained on ImageNet and histopathology data.
We investigate if features learned using more complex networks lead to gain in performance.
arXiv Detail & Related papers (2021-06-13T18:55:23Z)
- A Framework for Fast Scalable BNN Inference using Googlenet and Transfer
Learning [0.0]
This thesis aims to achieve high accuracy in object detection with good real-time performance.
The binarized neural network has shown high performance in various vision tasks such as image classification, object detection, and semantic segmentation.
Results show that the transfer learning method detects objects with higher accuracy than the existing methods.
arXiv Detail & Related papers (2021-01-04T06:16:52Z)
- CNNs for JPEGs: A Study in Computational Cost [49.97673761305336]
Convolutional neural networks (CNNs) have achieved astonishing advances over the past decade.
CNNs are capable of learning robust representations of the data directly from the RGB pixels.
Deep learning methods capable of learning directly from the compressed domain have been gaining attention in recent years.
arXiv Detail & Related papers (2020-12-26T15:00:10Z)
- Reservoir Memory Machines as Neural Computers [70.5993855765376]
Differentiable neural computers extend artificial neural networks with an explicit memory without interference.
We achieve some of the computational capabilities of differentiable neural computers with a model that can be trained very efficiently.
arXiv Detail & Related papers (2020-09-14T12:01:30Z)
- NAS-DIP: Learning Deep Image Prior with Neural Architecture Search [65.79109790446257]
Recent work has shown that the structure of deep convolutional neural networks can be used as a structured image prior.
We propose to search for neural architectures that capture stronger image priors.
We search for an improved network by leveraging an existing neural architecture search algorithm.
arXiv Detail & Related papers (2020-08-26T17:59:36Z)
- Spiking Neural Networks Hardware Implementations and Challenges: a
Survey [53.429871539789445]
Spiking Neural Networks are cognitive algorithms mimicking neuron and synapse operational principles.
We present the state of the art of hardware implementations of spiking neural networks.
We discuss the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.
arXiv Detail & Related papers (2020-05-04T13:24:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.